How Bad Is The COVID-19 Misinformation Epidemic?

The COVID-19 pandemic has been defined as much by what we don’t know as by what we do. That vacuum has been quickly filled with bullshit. The United Nations secretary-general has warned we’re living through “a pandemic of misinformation,” and the head of the World Health Organization has called it an “infodemic.” In the midst of battling a global health emergency, we find ourselves fending off another scourge: conspiracy theories and misinformation.

It certainly feels like there’s a lot of fake news swirling around about the coronavirus, but how does it compare to another major misinformation magnet: the 2016 election? Research on coronavirus misinformation is largely limited to public opinion surveys and preprint research that has yet to be peer-reviewed. But when we compare those preliminary findings to research on the 2016 election, they suggest that more people are seeing — and believing — misinformation now, and it may have something to do with the challenge of understanding a new disease.

Measuring exactly how much bunk is out there is a challenge to begin with, in part because so much misinformation is shared through social media, said Gordon Pennycook, a behavioral psychologist at Canada’s University of Regina who studies fake news. It’s possible to measure, for instance, the number of tweets linking to specific fake news websites, but there’s no way to see every instance in which a particular false claim is made on Facebook, especially when those claims can take many forms, including memes, Pennycook said.

An easier measure is how many people recall having seen fake news. A Pew Research Center survey conducted in the second week of March found that 48 percent of Americans reported seeing at least some made-up news about the outbreak. Only 20 percent of respondents said they had seen none; the remaining 32 percent said they had seen “not much.” In a preprint paper from researchers at Cornell University, respondents were simply asked whether they remembered seeing certain claims about the coronavirus, some true and some false. About 7 percent of respondents accurately recalled seeing a false claim.1
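That 7 percent is an adjusted figure (see footnote 1): Some respondents also “recalled” placebo claims invented just for the study, which gives researchers an estimate of spurious recall. Here is a rough sketch of one way such a baseline correction could work; the exact method and the input numbers are assumptions, not taken from the preprint itself.

    # A rough sketch of one way a baseline correction like the one described
    # in footnote 1 could work; the method and input numbers here are
    # assumptions, chosen so the result lands near the reported 7 percent.

    raw_recall = 0.15      # hypothetical: share recalling a real false claim
    placebo_recall = 0.08  # hypothetical: share "recalling" an invented claim

    # Spurious recall of invented claims estimates the false-memory baseline,
    # which is subtracted from raw recall of genuine false claims.
    adjusted_recall = max(0.0, raw_recall - placebo_recall)
    print(f"Adjusted recall: {adjusted_recall:.0%}")  # 7% with these inputs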

But just because you’ve seen fake news doesn’t mean you believe it. I’ve seen plenty of bogus claims on my own Facebook timeline, and I haven’t started drinking bleach yet. So far, research indicates that how many people actually believe these ideas depends on the claim. In the Cornell paper, the share of respondents who both recalled seeing a false claim and believed it ranged from 14 percent to 19 percent. That’s a higher share of the population believing fake news than we saw in 2016: One study published in the Journal of Economic Perspectives found that only 8 percent of Americans both recalled seeing and believed a piece of fake news about the 2016 election.

“So, on average, double the rate of people recalled and believed in fake news in the COVID context compared to what we saw in 2016,” said Douglas Kriner, a government professor at Cornell and one of the authors of the preprint.

Other studies have found a wider range in how willing people are to believe false claims. A preprint from Pennycook asked Americans2 in late March whether they believed specific claims about COVID-19, without indicating whether those claims were true or false. The share of Americans who said they believed false claims ranged from less than 1 percent (for the claim that eating garlic can cure the coronavirus; it cannot) to just over 20 percent (for the claim that the seasonal flu is just as dangerous as COVID-19; the flu is far less deadly). And a Pew Research Center survey conducted in mid-March reported that nearly 3 in 10 Americans believe the coronavirus was created in a lab, either accidentally or on purpose. (The evidence heavily points to the virus having emerged naturally.)

Many Americans are also having difficulty recognizing accurate information. In the Cornell paper, for example, when shown real headlines about treatments for COVID-19, only 40 percent of respondents, on average, judged those headlines to be true; the remaining 60 percent were “almost evenly divided between identifying the headline as false or acknowledging that they were unsure.”

“If people are unable to discern and say that true information is true, that is a real problem from a public health perspective,” said Sarah Kreps, a government and law professor at Cornell and a co-author of the paper.

In another preprint co-authored by Pennycook, respondents were able to distinguish between accurate and false information about COVID-19 fairly well when prompted to. But when asked only whether they would share the information on social media, not whether they believed it to be true, more were willing to share fake news.

“They still only believe true content like 65-ish percent of the time and the false content they believe like 25 percent of the time, so not great but at least there’s a difference,” Pennycook said. “If instead you ask them which ones they would share on social media, they’re terrible at discerning between them. They don’t, basically.”
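To put a number on that gap, here is a minimal sketch, in Python, of how a “discernment” score is often computed in this line of research: the difference between how often people endorse true content and how often they endorse false content. The belief rates are the rough figures Pennycook cites above; the sharing rates are hypothetical, chosen only to illustrate a near-zero gap.

    # A minimal sketch of a "discernment" score: the gap between how often
    # respondents endorse true content and how often they endorse false content.
    # Belief rates below are the approximate figures from Pennycook's quote;
    # sharing rates are hypothetical, used only to show a near-zero gap.

    believed_true = 0.65    # true headlines rated accurate (~65%, per the quote)
    believed_false = 0.25   # false headlines rated accurate (~25%, per the quote)
    accuracy_discernment = believed_true - believed_false

    shared_true = 0.30      # hypothetical sharing rate for true headlines
    shared_false = 0.28     # hypothetical sharing rate for false headlines
    sharing_discernment = shared_true - shared_false

    print(f"Accuracy discernment: {accuracy_discernment:.2f}")  # 0.40
    print(f"Sharing discernment:  {sharing_discernment:.2f}")   # 0.02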

One of the reasons we might be seeing more people falling for misinformation is the knowledge gap that comes with an emerging disease like COVID-19. There’s a lot we still don’t know about this virus, and that lack of understanding can create a vacuum that is all too easily filled by conspiracy theories and misinformation.

“An analogy that might be helpful here is autism: science still doesn’t have a clear explanation of why autism occurs,” said Brendan Nyhan, a government professor at Dartmouth College who studies disinformation. “That’s created a demand for explanations of why children develop it that anti-vaccine activists have taken advantage of. They’ve provided a simple story of why children develop autism, taking advantage of the coincidence in time between when autism manifests and when children get vaccines.”

So what do we do while we wait for better scientific answers on COVID-19? Social media sites, for their part, have enacted measures to limit and remove false information about COVID-19. Facebook, YouTube and Vimeo, for example, all removed the viral video “Plandemic” that claimed a shadowy group of elites was using the coronavirus to gain power. But an analysis from the nonprofit activist organization Avaaz found that it can take up to 22 days for Facebook to label fact-checked COVID-19 misinformation on the site, and that it can spread to millions of users in the meantime. And MIT researchers have found that misinformation peddlers can get around Facebook’s efforts to quell fake news by using an archived version of a URL via the Wayback Machine.

Studies have also found mixed results when it comes to issuing corrections to misinformation. One study on disinformation around Zika and yellow fever outbreaks in Brazil, published earlier this year in Science Advances and co-authored by Nyhan, found that priming people with a fact sheet of accurate information had no significant effect on their belief in subsequent false claims.

One of Pennycook’s papers suggests that merely prompting people to think about the fact that not everything online is true can be enough to reduce the amount of misinformation they share. But each of the researchers I spoke to expressed concern about people’s ability to discern accurate information in a pandemic, and about what impact the deluge of bunk might have on whether the truth gets through.

“The problem is we can’t say how much fake news is reinforcing beliefs and entrenching beliefs and causing people, almost, to enter this domain of nihilism where they just throw up their hands,” Kreps said. “They don’t know what to believe, so they’re not going to believe anything. Not believing anything can be as pernicious as believing fake news.”

Footnotes

  1. The raw share of people who claimed to remember fake news was much higher, but the researchers adjusted the figure downward because many respondents also claimed to remember fake claims that had been invented just for the study, suggesting people “recall” seeing fake claims they never actually came across.

  2. The survey, which was also conducted in Canada and the United Kingdom, was preregistered but not nationally representative.

Kaleigh Rogers is FiveThirtyEight’s technology and politics reporter.
