In the last decade or so, much hand-wringing has occurred over “fake news” – stories that are not factually accurate yet seem to get lodged in the popular psyche and stick. Many now accept the conventional wisdom that fake news spreads “farther, faster, deeper and more broadly than the truth.”
“The online spread of fake news and misinformation is clearly a major problem,” said Johan Ugander, an assistant professor of management science and engineering at Stanford University, who studies the phenomenon. “Most agree that the world would be better with less misinformation around. The question is: What can we do to stop it?”
To develop new and more effective techniques for squelching the spread of fake news, Ugander says we first need to understand how misinformation spreads. That is the subject of his most recent paper, published Oct. 25 in the journal Proceedings of the National Academy of Sciences, which refines the conventional wisdom and offers new hope of snuffing out fake news without eliminating social media altogether.
Deep and wide
Information, both good and bad, can spread in many different ways. First, it can travel broadly, with single individuals passing it along to a great number of people. Online content often achieves breadth by spreading through network “hubs” – sources with a great number of followers.
Alternatively, information can also travel deeply, flowing through networks of people who not only consume it but then pass it on to others who also pass it on, as in a game of telephone.
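To make the breadth-versus-depth distinction concrete, a reshare cascade can be pictured as a tree rooted at the original poster: breadth is how many people any single level reaches, while depth is how long the longest chain of reshares grows. Here is a minimal sketch using an invented toy cascade, not anything from the study's data:

```python
# Minimal sketch: measuring the depth and breadth of a reshare cascade.
# The cascade is modeled as a tree: each key is a user, each value lists the
# users who reshared directly from them. The data here is invented.

cascade = {
    "original_poster": ["a", "b", "c"],  # hub-like root reaching many people directly
    "a": ["d"],
    "d": ["e"],                          # longer chain: original_poster -> a -> d -> e
    "b": [], "c": [], "e": [],
}

def depth_and_breadth(tree, root):
    """Return (max depth, widest level) of a cascade via level-by-level traversal."""
    depth, max_breadth = 0, 0
    level = [root]
    while level:
        max_breadth = max(max_breadth, len(level))
        next_level = [child for node in level for child in tree.get(node, [])]
        if next_level:
            depth += 1
        level = next_level
    return depth, max_breadth

print(depth_and_breadth(cascade, "original_poster"))  # -> (3, 3)
```

A hub-driven cascade would show a large breadth at a shallow level; a telephone-game cascade would show a long depth with narrow levels.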
Getting a handle on how fake news spreads is important for curbing it. If it’s true that fake news spreads more broadly than real news, social media platforms looking to counter misinformation could take steps to reduce the role of hubs by, for example, demoting them in their feed algorithms. But if fake news spreads more like a game of viral telephone, long chains of diffusion should be a red flag for social media platforms. “In that scenario, one technique for limiting deep diffusions would be to warn users about stories that have been passed along through long chains of readers,” Ugander said.
In their research, Ugander and his co-author, Jonas Juul at Cornell University, show that neither model fully captures how fake news actually spreads.
Fake is infectious
In the new study, Ugander and Juul took a large trove of Twitter data and broke it up by size, matching true and false news stories that had reached the same number of people. They found that, given two stories, one true and one false, that had each reached 100 people, it was impossible to tell them apart by looking at their diffusion, or “cascade,” patterns alone.
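One way to picture that size-matching step is the hypothetical sketch below; it is not the authors' actual analysis, and the cascade records, sizes, and depths are invented to mirror the finding that same-sized true and false cascades look alike:

```python
# Hypothetical sketch of a size-matched comparison: group cascades by the
# number of people reached, then compare a structural statistic (here, depth)
# between true and false stories of the same size. Records are invented.
from collections import defaultdict
from statistics import mean

cascades = [
    # (veracity, people_reached, max_depth) -- toy records, not real data
    ("false", 100, 6), ("true", 100, 6),
    ("false", 100, 5), ("true", 100, 5),
    ("false", 1000, 9), ("true", 1000, 9),
]

by_size = defaultdict(lambda: {"true": [], "false": []})
for veracity, size, depth in cascades:
    by_size[size][veracity].append(depth)

for size, groups in sorted(by_size.items()):
    if groups["true"] and groups["false"]:  # only compare sizes with both kinds
        print(f"size {size}: avg depth true = {mean(groups['true'])}, "
              f"false = {mean(groups['false'])}")
```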
In other words, fake news didn’t spread more broadly or more deeply than real news. But it did spread more easily because it was more infectious – for reasons that are still unclear, people are more likely to share a piece of fake news than real news.
The findings have important policy implications, the researchers argue, because they show that differences in the spreading patterns of fake and real news are a red herring. “The differences in the patterns can all be explained by the more fundamental fact that fake news spreads person to person at a higher rate,” Juul said.
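That intuition can be mimicked with a toy branching-process simulation, offered here as a sketch under strong assumptions rather than the paper's model; the follower counts and share probabilities are invented. The spreading mechanics are identical for the two kinds of stories, and only the per-exposure share rate differs:

```python
# Toy branching-process sketch: true and false news spread by identical
# mechanics; the only difference is the probability that an exposed person
# reshares. All parameters are invented for illustration.
import random

def simulate_cascade(p_share, followers=5, max_size=10_000):
    """Each share exposes `followers` people, each of whom independently
    reshares with probability p_share; return the total number of sharers."""
    size, frontier = 1, 1
    while frontier and size < max_size:
        new_shares = sum(random.random() < p_share for _ in range(frontier * followers))
        size += new_shares
        frontier = new_shares
    return size

random.seed(0)
true_sizes = [simulate_cascade(p_share=0.10) for _ in range(2000)]
false_sizes = [simulate_cascade(p_share=0.15) for _ in range(2000)]

# The more "infectious" process reaches more people on average, even though
# nothing else about how it spreads is different.
print(sum(true_sizes) / len(true_sizes), sum(false_sizes) / len(false_sizes))
```

In this sketch, “infectiousness” is simply the per-exposure reshare probability; bumping it from 0.10 to 0.15 noticeably raises the average reach even though the two processes are otherwise identical.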
The researchers hope this finding will simplify and focus the policy debate. “Based on our findings, the two policy suggestions mentioned earlier – restraining hubs and flagging long chains – would be ineffective,” Ugander said. “Flagging long chains would slow the spread of both false and true news equally. And shutting down a hub would be the news equivalent of closing down concerts and sporting events, an overly cautious and potentially unpopular approach.”
Rather, he added, the policy debate should instead focus on limiting the basic infectiousness of fake news.
Ugander cited positive findings in recent work by a team of psychologists who encouraged people to consider the accuracy of content before sharing it. This simple tactic broadly reduced the transmission of false news but not of real news. “This tactic is the equivalent of implementing a mask mandate to slow the spread of COVID-19 – it allows people to socialize but still limits the spread of the virus,” Ugander said.
Another option might be to help readers identify fake news and dubious sources in the first place, through stronger civics and digital literacy education that lowers the infectiousness of false news while still furthering the societal benefit brought by positive infections of factual news.
“There’s a really easy way to stop the spread of fake news: stop the spread of all news,” Ugander said. “But we probably don’t want that. We want people to share news, but to share real news over fake.”
This work was supported in part by an ARO MURI award.
Media Contacts
Ker Than, School of Humanities and Sciences: (650) 723-9820; kerthan@stanford.edu