Despite overwhelming scientific evidence that COVID-19 vaccines are a safe and effective means of preventing severe cases of a disease that has killed nearly one million people in the U.S., false and misleading claims have proliferated, seeking to undermine public confidence in the vaccines’ safety and to discourage their uptake.
As the pandemic remains an ongoing health emergency with new variants spreading rapidly, it is increasingly urgent that accurate vaccine-related information be readily available to the public, said Stanford scholar and leading expert on mis- and disinformation Renée DiResta.
Throughout 2021 and into the present, DiResta’s team at the Stanford Internet Observatory (SIO) has been working hard to detect and disrupt mis- and disinformation related to the COVID-19 vaccines in real-time as part of her work leading the Virality Project, a multi-year effort between SIO and five other research groups. Their collaboration has culminated in a new report, Memes, Magnets and Microchips: Narrative dynamics around COVID-19 vaccines, that offers specific recommendations for how public health officials, social media platforms and other academic institutions can counter and curb the spread of false or misleading information that has a potential negative impact on individual or public health.
“There are very real-world impacts to vaccine hesitancy at this moment in time,” said DiResta, the technical research manager at SIO. “Myths, rumors and disinformation contribute to vaccine hesitancy, and thinking about our information environment is part of understanding the public health quandary that we find ourselves in.”
The Virality Project grew out of prior research addressing election-related mis- and disinformation during the 2020 U.S. election, which DiResta was involved with at SIO alongside the University of Washington’s Center for an Informed Public (UW), Graphika, a social network analysis firm, and the Atlantic Council’s Digital Forensic Research Lab.
When mis- and disinformation about vaccination started going viral at the start of the coronavirus pandemic – long before a vaccine against COVID-19 had even been developed or approved – the team reassembled (adding new members NYU Tandon and the National Conference on Citizenship) to apply what it had learned during the 2020 U.S. election to the COVID-19 crisis unfolding all around them. The team also wanted to better understand what makes anti-vaccination rhetoric, which has existed as long as vaccines themselves, distinct from other forms of mis- and disinformation. DiResta began tracking that rhetoric in 2015, when she co-founded Vaccinate California, a parent advocacy group working to improve public health in California by raising vaccination rates.
Tackling real-time disinformation about the COVID-19 vaccines
Since the spring of 2020, the Virality Project has investigated some 900 incidents across major social media platforms in the U.S., including Facebook, Instagram, Twitter, Reddit and TikTok, as well as newer online spaces such as Gab, Parler, Telegram and Gettr. In addition, posts in Spanish and Mandarin Chinese – the two most spoken languages in the United States after English – were also analyzed by language specialists.
The Virality Project has convened more than 30 weekly briefings, provided in-depth policy analysis and issued rapid-response reports examining the overarching trends and tactics used by the anti-vaccination community and its influential supporters, along with guidance on how to stop rumors and lies from spreading further.
In Memes, Magnets and Microchips, the researchers break down what they’ve learned from their various narrative tracking efforts. For example, they detail the various ploys online influencers use to undermine health experts, like how they sow doubt by asking their followers to “do their own research” on already well-established scientific facts, or by highlighting incredibly rare events or miscasting statistics to give the impression such occurrences are more common and harmful than they actually are.
“They’re framed as if large numbers of people are being injured by the vaccines. And that’s not true: far, far greater numbers of people are dying from COVID,” said DiResta. For example, anti-vaccination influencers like Robert F. Kennedy Jr. and Joseph Mercola have cherry-picked unverified reports from the Vaccine Adverse Event Reporting System (VAERS), the public health database where people self-report adverse events after vaccination, in a way that misrepresents vaccine safety.
“For people who don’t spend all their time looking up or understanding how vaccine trials or vaccine adverse event reporting systems work, these statistics and numbers can be very scary,” said DiResta.
In an information-saturated world, it can be hard for people to distinguish between established facts and speculation. This was especially true in the early days of the pandemic, when there was so much uncertainty about how to respond to the new, emerging disease. For example, there was even a brief debate about whether face coverings were an effective strategy to curb community transmission of COVID-19. At first, the Centers for Disease Control and Prevention (CDC) said people didn’t need to wear masks. The agency didn’t change its stance on masking until early April 2020, and by that point it already faced the challenge of countering a vast amount of misinformation, including its own inaccurate messaging, DiResta pointed out.
Slow responses like this have led to public distrust of public health officials, said DiResta.
“Even people who have not taken the vaccine are not necessarily anti-vaccine. Many of them are hesitant,” she said. “They don’t know who to trust or what to believe. Understanding how information reaches them, and what sources they trust and continue to share, is important for understanding how to address false and misleading claims on social media in the long term.”
Adding further to the issue has been a growing partisan divide about the country’s handling of the pandemic. The anti-vaccination community and other influential proponents, including conservative news broadcasters, took advantage of this chaotic, politicized news environment to further undermine the public’s confidence in authoritative sources.
What can be done?
Memes, Magnets and Microchips provides tailored recommendations that the academic research community, social media platforms and public health officials can follow to mitigate the deleterious effects of mis- and disinformation.
Overall, there are three things that must happen, DiResta said.
One is finding a trustworthy messenger to deliver accurate information to people still hesitant about the vaccines. “The messenger is very, very important – particularly at a time when people have lost a lot of confidence in the government and traditional authorities,” she said.
That person could be someone from within their own community with similar lived experiences and background, and therefore better able to connect on a deeper, personal level. This is particularly important in underserved or marginalized communities. For example, when the Stanford Cyber Policy Center and the Virality Project hosted Surgeon General Vivek H. Murthy for a panel discussion about vaccine misinformation, one of the panelists, Katrina Rudolph, who runs a hair salon in a predominantly Black community, shared how she uses her role as a local hairdresser to talk supportively with clients, colleagues and friends about making informed decisions about their health, such as seeking information about the vaccine from their healthcare provider rather than from what they see on social media.
It is also important that people combatting mis- and disinformation have a clear understanding of what exactly is spreading across social media – whether it is specific allegations or an emotional resonance point – so that they can respond in a way that directly addresses specific concerns a vaccine-hesitant person may have. “Just saying ‘the vaccines are safe and here are the stats’ is not enough,” said DiResta. “It doesn’t address the root concern that people have.”
Finally, the format of the message matters too – particularly whether it lends itself to a social media environment. For example, releasing a PDF or holding a press conference that details scientific information in a cold, matter-of-fact way might not suit a chatty, digital world. DiResta suggests communities get creative with how they promote their messages, such as coming up with a clever meme or TikTok video. An effective example was the #ThisIsOurShot grassroots campaign, in which frontline healthcare workers – considered among the most trusted sources of health information – shared their own personal experiences of getting vaccinated on social media.
Ultimately, given the dynamic nature of social media and the divisive political environment, the scholars say members from across society, including government, academia, civil society, individual citizens and industry, must be united in addressing dis- and misinformation in real-time.
“The fight against misinformation is only beginning,” the authors of Memes, Magnets and Microchips conclude. “The collective effort must continue.”
Executive editors of the report are Renée DiResta, Elena Cryst and Lily Meyersohn, all affiliated with Stanford University.
Stanford staff contributing to the report include: Carly Miller.
Stanford students who also contributed to the report include: Toni Friedman, Zoe Huczok, Lindsay Hurley, Chase Small, Abigail Tarquino and Julia Thompson.
Other contributors from outside Stanford are: Taylor Agajanian, Julienne Ching, Osiris Cruz-Antonio, Kaitlyn Dowling, Laura Edelson, Kris Fortmann, Cameron Hickey, Alyssa Kann, Kolina Koltai, Kathy Liu, Rachel Moran, Kyle Weiss, Matt Masterson and Erin McAwee.
Media Contacts
Melissa De Witte, Stanford News Service: mdewitte@stanford.edu
Ari Chasnoff, Freeman Spogli Institute for International Studies: chasnoff@stanford.edu