
Evaluating the truthfulness of fake news through online searches increases the chances of believing misinformation

Conventional wisdom suggests that searching online to evaluate the veracity of misinformation would reduce belief in it. But a new study by a team of researchers shows the opposite occurs: Searching to evaluate the truthfulness of false news articles actually increases the probability of believing misinformation.

The findings, which appear in the journal Nature, offer insights into the impact of search engines’ output on their users, a relatively under-studied area.

“Our study shows that the act of searching online to evaluate news increases belief in highly popular misinformation, and by notable amounts,” says Zeve Sanderson, founding executive director of New York University’s Center for Social Media and Politics (CSMaP) and one of the paper’s authors.

Search-engine output may explain this outcome: the researchers found that the phenomenon is concentrated among individuals for whom search engines return lower-quality information.

“This points to the danger that ‘data voids’ (areas of the information ecosystem that are dominated by low quality, or even outright false, news and information) may be playing a consequential role in the online search process, leading to low return of credible information or, more alarming, the appearance of non-credible information at the top of search results,” observes lead author Kevin Aslett, an assistant professor at the University of Central Florida and a faculty research affiliate at CSMaP.

In the newly published Nature study, Aslett, Sanderson, and their colleagues examined the impact of using online search engines to evaluate false or misleading news stories, an approach encouraged by technology companies and government agencies, among others.

To do so, they recruited participants through both Qualtrics and Amazon’s Mechanical Turk, tools frequently used in behavioral science research, for a series of five experiments aimed at gauging the impact of a common behavior: searching online to evaluate news (SOTEN).

The first four studies tested the following aspects of online search behavior and impact:

  • The effect of SOTEN on belief in both false or misleading and true news within two days of an article’s publication (false popular articles included stories on COVID-19 vaccines, the Trump impeachment proceedings, and climate events)
  • Whether SOTEN can change an individual’s evaluation after they have already assessed the veracity of a news story
  • The effect of SOTEN months after publication
  • The effect of SOTEN on recent news about a salient topic with significant news coverage, in this case the COVID-19 pandemic

A fifth study combined a survey with web-tracking data in order to identify the effect of exposure to both low- and high-quality search-engine results on belief in misinformation. By collecting search results using a custom web browser plug-in, the researchers could identify how the quality of these search results may affect users’ belief in the misinformation being evaluated.

The study’s source credibility ratings were determined by NewsGuard, a browser extension that rates news and other information sites in order to guide users in assessing the trustworthiness of the content they come across online.
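For readers who want a concrete picture of how result quality might be measured, the sketch below averages NewsGuard-style source ratings over the results returned for a single search. It is a minimal illustration only: the domain names, the ratings, and the low-quality threshold are invented for this example, and the code is not the authors’ analysis pipeline or NewsGuard’s actual data.

```python
# Minimal sketch (not the study's code): score one search's results by the
# credibility of their source domains, using hypothetical 0-100 ratings.
from urllib.parse import urlparse

# Hypothetical source-credibility ratings; NewsGuard scores sites on a 0-100
# scale, but these particular domains and numbers are made up for illustration.
CREDIBILITY = {
    "example-news.com": 95.0,
    "example-blog.net": 37.5,
}

def domain(url: str) -> str:
    """Extract the host of a search-result URL (simplified)."""
    return urlparse(url).netloc.removeprefix("www.")

def mean_result_quality(result_urls: list[str], default: float = 50.0) -> float:
    """Average credibility score of the sources returned for one search."""
    scores = [CREDIBILITY.get(domain(u), default) for u in result_urls]
    return sum(scores) / len(scores) if scores else default

if __name__ == "__main__":
    results = [
        "https://www.example-news.com/fact-check/claim-x",
        "https://example-blog.net/claim-x-exposed",
    ]
    quality = mean_result_quality(results)
    # Flag the search as "low quality" below an arbitrary threshold of 60.
    print(f"mean source quality: {quality:.1f}",
          "low" if quality < 60 else "high")
```

In this toy setup, a search whose top results come mostly from low-rated domains would be classed as returning lower-quality information, which is the kind of exposure the fifth study relates to belief in misinformation.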

Across the five studies, the authors found that the act of searching online to evaluate news led to a statistically significant increase in belief in misinformation, whether the search took place shortly after the misinformation was published or months later. This finding suggests that the passage of time, and ostensibly the opportunity for fact checks to enter the information ecosystem, does not lessen the effect of SOTEN on the likelihood of believing false news stories to be true. Moreover, the fifth study showed that this phenomenon is concentrated among individuals for whom search engines return lower-quality information.

“The findings highlight the need for media literacy programs to ground recommendations in empirically tested interventions and search engines to invest in solutions to the challenges identified by this research,” concludes Joshua A. Tucker, professor of politics and co-director of CSMaP, another of the paper’s authors.

The paper’s other authors included William Godel and Jonathan Nagler of NYU’s Center for Social Media and Politics, and Nathaniel Persily of Stanford Law School.

The study was supported by a grant from the National Science Foundation (2029610).
