If any group understands the toll misinformation can take on the public understanding of science, it’s climate scientists. For years, they have been trying to convey the findings from a ceaseless stream of studies showing the world is warming, while combating misinterpretations and outright fake news. A similar infodemic—a surplus of information both legitimate and misinformed—now plagues the COVID-19 outbreak.
In the internet era, when research papers are readily available, everyone can become an expert on COVID-19 or climate change. But pundits can also cherry-pick the data that matches their beliefs and seem to speak with authority. These types of personalities appear in traditional media such as television, but their work truly thrives on social and video-streaming platforms. Part of the reason is that social media remains largely unregulated, and the attention—the “likes” and engagement—we receive on a post can incentivize us to share.
“It feels like we’ve been living in a world of misinformation for a few decades, but the amplification and reach is out of this world with new platforms,” says Sarah Evanega, the director of Cornell University’s Alliance for Science, an organization dedicated to correcting misconceptions.
And this is also a time of intense partisanship, when people tend to look to their political leaders to help them decide how to think about issues, including science. This reliance on political leanings can make people susceptible to unscientific arguments.
“People say, ‘Well, Europe is opening schools, so why aren’t we opening schools?’” or they compare COVID-19 to the flu, says John Cook, a communications expert at George Mason University who studies climate change misinformation. “Those kinds of analogies are very simplistic and misleading.”
For many people, climate change and COVID-19 feel remote; these seemingly invisible threats create psychological distance. That distance can lead people to undervalue the danger and to see the solutions as worse than the problem itself.
“We’re told the solutions are worse than the impacts: ‘Destroy the economy, turn the country socialist,’” says Katharine Hayhoe, a climate scientist at Texas Tech University. “These are the things people say to avoid climate action, and of course that’s not true at all.”
Misinformation may feel overwhelming, but there are ways to fight it, say those who study its pervasive reach. By recognizing what it looks like and where it comes from, experts say we can help set the facts straight.
Setting the stage
Worldwide, scientists have published tens of thousands of studies on COVID-19 this year at a breakneck pace. While not experiencing as dramatic an uptick, studies of climate change increased exponentially from 1951 until the end of the millennium, doubling in number every 11 years. This pattern has accelerated this century as the dangers from climate disasters have become more apparent.
But to disseminate information about COVID-19 to public authorities as rapidly as possible, scientific journals are under pressure to rush the careful vetting normally required to publish new science.
“We have published [studies] within a week of submission,” says Jennifer Zeis, the director of communications at the New England Journal of Medicine. “This is unusual—we’re not a breaking news organization, and this is a big stretch for our resources.”
During the pandemic, this push for life-saving information has also led to a flurry of articles appearing on what are known as preprint servers. These online platforms allow researchers to share their work almost as soon as the experiments are done, unlike academic journals, which are more exclusive and require a time-consuming review of a would-be author’s work by their scientific peers.
“A tremendous amount of important science [on COVID-19] has appeared. It has been quite unprecedented,” says John Inglis, cofounder of medRxiv, the largest medical preprint server. “Obviously, some of it is wrong.”
Pronounced “med archive,” this particular preprint server has been flooded with new research about SARS-CoV-2. In January, the site posted 390 papers on various subjects, but by May, that number had jumped to 2,200, most about COVID-19.
MedRxiv employs a screening process to make sure a submitted research paper includes results, and is not an editorial or unsubstantiated hypothesis. The site then runs a plagiarism check and examines whether the paper makes a harmful claim. But, Inglis says, this process doesn’t determine whether the study is accurate, reliable, or prone to misinterpretation.
Inglis views preprint servers as a natural part of the scientific process, but he says their audience has likely expanded beyond just academics as COVID-19 has sparked lay interest. In May, medRxiv had 10 million page views and around six million downloads. That means people who are not experts can access the papers, interpret them in ways that fit with their existing beliefs, and then share their opinions with others.
“Preprint is not evil and peer review is not perfect. It’s all very gray,” says Ivan Oransky, a medical journalism lecturer at New York University and the co-founder of Retraction Watch, a news database dedicated to highlighting when studies are retracted or corrected.
Out of around 50,000 papers and preprints about SARS-CoV-2 and COVID-19 published since January, Retraction Watch has tracked 36 retractions, but Oransky notes it usually takes around three years for a study to be corrected.
He offers one piece of advice for how to consume COVID-19 news: Don’t rely on a single study to provide the whole truth, but rather form judgments after a series of studies coalesce around a consensus.
Fighting an uphill political battle
One of the biggest predictors of whether someone is likely to disavow climate change or COVID-19 is political affiliation. Cook’s research has shown that political leaders can significantly influence a person’s attitude about climate change, and he suspects the same is true for COVID-19.
For example, a number of polls and think tank reports show that an overwhelming majority of Democrats take COVID-19 seriously, wear masks, and practice social distancing, while only a minority of Republicans do the same. This political polarization was an “avoidable tragedy,” Cook says, pointing to President Donald Trump’s early and persistent dismissal of wearing masks and social distancing as major factors driving today’s partisan divide.
“When our tribal leaders send us cues, the tribe tends to move in that direction,” Cook says. “Leadership matters.”
In an early April analysis of COVID-19 misinformation, researchers at Oxford University found that while the majority of fake news about the pandemic is spread by average social media users, top politicians or celebrities receive more attention and engagement on their posts.
“A single non-expert with a large platform, whether they’re a celebrity or a political figure, can have a disproportionate effect on the population,” Evanega says.
That’s been especially true for COVID-19. In a study published in September, Evanega and her team analyzed a database of 38 million pieces of English-language content published between January 1 and May 26. They found just over a million news articles that either spread or reported on misinformation related to the pandemic.
The most popular misinformation centered on miracle cures—drugs with no proven clinical benefit that are nonetheless touted as effective. Notably, her team also found that the president was the primary individual driver of misinformation, showing up in 38 percent of those misleading articles, and the biggest spikes in misinformation came when he made pronouncements about COVID-19 remedies.
Even when politicians are not spreading misinformation, people can struggle to discern what is real. In a study published in late June in Psychological Science, scientists recruited 1,700 adults to track what influenced their likelihood to share COVID-19 misinformation on social media.
Two groups were presented with headlines perpetuating false information about the pandemic. The first group was asked how likely they were to share the news, while the second group was asked to determine the headlines’ accuracy. Comparing the two groups, participants were 32 percent more likely to say they would share a false headline than they were to rate it as accurate.
However, in a second experiment, study participants were asked to judge whether a headline was accurate before sharing it. Researchers found that this small nudge to think critically made study participants three times more likely to spot misinformation.
“On social media, people are focused on how much their friends and followers are going to ‘like’ their posts—the amount of positive social reinforcement they are going to get—rather than accuracy,” says David Rand, one of the report’s authors and a researcher at the Massachusetts Institute of Technology who studies the decision-making behind the spread of misinformation.
A path forward
To combat scientific misinformation, Cook recently developed the prototype for a game that explains different misinformation tactics. By exposing players to what misinformation looks like, the game teaches them to think critically so they can better identify misinformation later on.
This strategy works in a lab, he says, but he’s unconvinced that misinformation can be combated on a global scale.
“I’m a bit of a pessimistic person by nature, but having worked on climate denial for 15 years now and seen and heard horrible things, I’m seeing those same dynamics play out with COVID in 2020,” he says. “COVID denial is climate denial on fast forward.”
Hayhoe is slightly more optimistic and continues to actively communicate her climate research at talks and on social media. In 2018, she gave a TED Talk—that’s now been viewed 3.6 million times—about communicating climate science to people who are skeptical of science. She believes productive conversations are possible.
First and foremost, she says, “there has to be mutual respect.” Both sides must find common ground—“something we can agree on”—that helps move toward a positive solution.
“The COVID-19 pandemic is a real dire demonstration of how misinformation has real world and immediate consequences on public health,” Evanega adds. “It really is a matter of life and death.”