
Disinformation investigators: Stanford students sleuth for false, misleading reports on how to vote

Ahead of the 2020 election, Stanford students investigate the spread of mis- and disinformation online as part of their work with the Election Integrity Partnership.

When it was revealed this fall that photos circulating on social media — purportedly showing unopened ballots tossed in a Sonoma County, California, dumpster — were not from the 2020 election but were actually images of empty mail-in ballot envelopes being legally discarded in 2018, a team of Stanford students sprang into action.

Video by Kurt Hickman: Ahead of the 2020 election, Stanford students investigate the spread of mis- and disinformation online as part of their work with the Election Integrity Partnership, a coalition that includes the Stanford Internet Observatory, the Atlantic Council’s Digital Forensic Research Lab (DFRLab), Graphika, and the University of Washington’s Center for an Informed Public.

They wanted to find out how far the bogus claim had spread online. Was it contained mostly to Facebook and Twitter, or were people sharing it on other popular platforms like Instagram, Snapchat, TikTok or Nextdoor? They also wanted to know how many people had shared it, and whether any public figures were involved whose verified, “blue-check” status could lend credibility to the misinformation. More generally, what does this one incident reveal about how a false narrative around fraud and mail-in voting is constructed?

These efforts to track the spread of misinformation are part of the Election Integrity Partnership (EIP), a joint effort between Stanford and three other research entities experienced in studying online disinformation. Their mission is to find and mitigate attempts to interfere with the 2020 election by identifying, in real time, misleading content that has the potential to prevent or deter people from voting or to delegitimize election results.

If the team is concerned by the publicly available content they see, they report it directly to the social media platforms and ask them to consider labeling it with voter resource guides or misleading-content labels. In some cases, if it is clear that the content violates platform policies on election information, EIP will recommend the post be removed.

“What we’re trying to demonstrate is if you come up with really specific criticisms, scenarios that we think are probably going to happen and reasons why these platforms’ policies don’t cover them, then that gives them a specific target to aim for, and I think you can have a real impact,” said Alex Stamos, who leads Stanford’s involvement in the EIP through the Stanford Internet Observatory. The observatory, an interdisciplinary program run under the auspices of the Cyber Policy Center at the Freeman Spogli Institute for International Studies (FSI), studies the abuse of information technologies, particularly on social media platforms.

Social media sleuthing

Since the EIP launched in September, a team of 30 Stanford students – called “tier-one research analysts” – has responded to over 500 reports – dubbed “tickets” – of misleading or false content. These tickets come either from trusted civil society organizations that EIP works with, such as the National Association for the Advancement of Colored People (NAACP) and the American Association of Retired Persons (AARP), or through the team’s own monitoring of publicly available social media content.

Once a ticket is submitted, analysts answer roughly a dozen questions about the reported content.

“The first question we really want to know is what’s the reach? That helps us determine how severe a specific issue is,” explained Elena Cryst, the assistant director of the Stanford Internet Observatory. In addition to documenting and archiving where the misinformation appears on major social media networks, analysts also record how many shares, likes or comments the content has received on each platform. “Our concern is looking at how this is affecting people across the country,” Cryst said.

Analysts work in four-hour shifts, with 20 hours of coverage a day, seven days a week, scanning information that is exchanged publicly (the team does not examine information in private, closed groups).

“We’re trying to find incidents of mis- or disinformation before they start to go viral. If we get there and something already has millions of views, it’s kind of too late – a lot of people have already seen it,” said Isabella Garcia-Camargo, a computer science co-term who took a leave of absence to work full-time as the product manager for EIP and has been the driving force behind the project.

Analysts have examined tickets about compromised voting machines and tampered software, as well as well-intended but misleading information about domestic violence victims and public voter registration records.

Sometimes identifying misinformation is not entirely straightforward. What is legal in one state may not be permissible in another – such as delivering or collecting absentee ballots on behalf of another person, a practice sometimes referred to disparagingly as “ballot harvesting.” In other instances, it is not always apparent where the content fits into a platform’s policy on free speech and political expression.

“Some content is so borderline it’s really hard to apply even very clear principles,” said Cooper Raterink, a tier-one analyst and a Stanford graduate student in computer science, adding that automating the process through an algorithm or other technological method would be really difficult. “What needs to be done is putting a lot of people on it, and a lot of thought as to what we want from these platforms online.”

Broader investigations

When the team identifies a ticket that could affect broader issues like voter suppression or the integrity of election results, they conduct a deeper analysis with their partners at the EIP: the social media analysis firm Graphika, the Atlantic Council’s Digital Forensic Research Lab and the University of Washington’s Center for an Informed Public (UW).

“We try to build out a bigger picture of how concerning this may be and if we need to develop any counter-narratives, talk to the media or publish a post on our own blog platform discussing what we’ve seen and how this content may be misconstrued to the general public,” Cryst said.

For example, using the “ballot dumping” incident in Sonoma County as a case study, researchers at Stanford and UW wrote a comprehensive report for the EIP blog analyzing how the major spreaders of false reports this election have been verified, domestic influencers.

According to Stamos, while foreign interference was a problem in 2016, it is minimal compared to amplification from actors within the U.S. who are priming people with a narrative that this election is being “stolen.”

“The biggest thrust of disinformation this year is to convince Americans that there’s some kind of conspiracy afoot to steal the election,” Stamos added. “Verified domestic influencers are by far the biggest source of the amplification of this disinformation. Although some of the narratives have come from foreign sources, for the narrative to get wide distribution, they usually go through domestic political actors who have very large followings online.”

Importance of double-checking information

While rumors during election season have always existed, what is different today is that millions of people can see them in a very short amount of time, Stamos pointed out.

Another source of misinformation is unverified “friend of a friend” reports – think posts on social media like “my cousin went to vote and his vote was switched by the machine” or “my friend works for the post office and said that they’re burning ballots from a certain political party.”

“That kind of stuff is almost never true,” Stamos said. His advice is to wait for legitimate reports to emerge in credible news outlets, written by journalists experienced in fact-checking, before sharing these kinds of stories. As the example in Sonoma County demonstrated, context matters: while the image was real, the context it was used in was not.

“To someone who wants to get involved and help out with curtailing the spread of misinformation online, I would say, first, make sure that you’re not part of the spread,” Raterink said.

Garcia-Camargo sees a continuing role for watchdog entities like the EIP beyond the 2020 elections to help protect the democratic process.

“I think that, more and more, we are seeing people wanting to make sure that people are able to get the facts and get the right stories and to understand why the information that they’re seeing is being put in front of them,” Garcia-Camargo said. “I really hope the EIP and the involvement that the students were able to have in it inspires people to work in the space after the election is over.”

For more information about the EIP and the mis- and disinformation issues it has documented in the 2020 election, visit its website, eipartnership.net, and follow @2020partnership on Twitter.