Two Ohio State researchers are trying to determine how people respond to misinformation about the COVID-19 pandemic and how to correct it.
Thomas Wood, an assistant professor of political science who researches misinformation, and Kelly Garrett, a professor of communication who specializes in political misinformation, are researching different aspects of COVID-19 misinformation. Wood’s research aims to understand how effective fact checks are against COVID-19 misinformation, and Garrett’s explores how easily people can tell false and factual claims apart.
“It turns out that people are very receptive to succinct fact checks, even on highly divisive, fraught, polarized political questions,” Wood said. “Success in that area has led us to work with social media companies, search companies and news organizations to figure out if we can measure the effectiveness of fact checks.”
Wood’s research involves surveying three groups of people from other countries. The first group is asked whether it believes a given vaccine conspiracy theory; the second is shown vaccine misinformation and then asked whether it believes the conspiracy; and the third is shown misinformation paired with a correction and then asked the same question. The first group acts as a control to determine whether misinformation and fact-checking have an effect on people’s beliefs.
Google, which provided the funding for Wood’s research as part of a $450,000 grant, will use the findings to help news organizations effectively cover the pandemic, he said.
“We have an interest in the maximum extent possible improving people’s factual understanding about the vaccine,” Wood said.
Both Wood and Garrett said false information surrounding the coronavirus vaccine has exploded on social media sites, especially Facebook.
“Low-credibility news has been able to capitalize on social media,” Garrett said. “News sites that would otherwise have been probably relatively unknown and had relatively little use found big audiences on Facebook.”
Garrett’s research surveys a group of Americans and asks them to differentiate examples of factual claims about COVID-19 on social media from false ones. He said his team will use the information to try to find out why some people are better at distinguishing factual claims than others and whether these differences predict their behavior relating to COVID-19.
In 2018, MIT researchers found that false news travels faster than real news on Twitter. One theory that explains this is called the novelty hypothesis.
“It may be, in part, that what we’re seeing is that falsehoods get more attention especially when they are unusual or unique or shocking,” Garrett said. “People are drawn to shocking headlines and shocking claims.”
Wood said that because social media companies have moderated other content on their platforms, such as hate speech and election misinformation, there is precedent for them to limit false claims about the coronavirus.
On Feb. 8, Facebook banned misinformation about all vaccines and expanded its ban on COVID-19 misinformation to include claims such as the idea that the virus was human-made or that getting COVID-19 is safer than the vaccine. Twitter announced March 1 it would put labels — similar to those used to mark election misinformation — on tweets that contained vaccine misinformation.
Despite the amount of research already done, Garrett said there is still little information about social media’s effect on how informed people are, and no definitive findings on whether misinformation influences people’s behavior.
To avoid spreading misinformation online, Wood recommends people make sure they stay factually informed by reading scientific research on COVID-19 and using that knowledge to counter false claims on social media.
“Even if it generates a negative response, it’s probably more persuasive than you expect,” Wood said.