Combining two simple tools could combat election misinformation

by Pelican Press

Credit: Dziana Hasanbekava from Pexels

A popular new strategy for combating misinformation doesn’t by itself help people distinguish truth from falsehood but improves when paired with reminders to focus on accuracy, finds new Cornell University-led research supported by Google.

Psychological inoculation, a form of “prebunking” intended to help people identify and refute false or misleading information, uses short videos in place of ads to highlight manipulative techniques common to misinformation, such as emotional language, false dichotomies and scapegoating. The strategy has already been deployed to millions of users of YouTube, Facebook and other platforms, and could be used after the U.S. presidential election.

In a series of studies involving nearly 7,300 online participants, an inoculation video about emotional language improved recognition of that technique—but did not improve people’s ability to discern true headlines from false ones, the researchers found. Participants’ ability to identify true information improved when the video was bookended with video clips prompting them to think about whether content was accurate, suggesting a combined approach could be more effective, the researchers said.

“If you just tell people to watch out for things like emotional language, they’ll disbelieve true things that have emotional language as much as false things that have emotional language,” said Gordon Pennycook, associate professor of psychology. “Encouragingly, we found some synergy between these two approaches, and that means we may be able to develop more effective interventions.”

Pennycook is the first author of “Inoculation and Accuracy Prompting Increase Accuracy Discernment in Combination but Not Alone” in Nature Human Behaviour.

Prior studies involving members of the research team showed that inoculation videos helped people identify manipulative techniques in sample tweets. That raised hopes that a relatively simple intervention could be implemented on a large scale to “immunize” populations against potentially viral misinformation.

The new study investigated whether inoculation’s benefits extend to more realistic conditions: helping people judge whether information is actually true.

In three initial studies, participants watched the same emotional language video used in the earlier study, which warns viewers to be wary, for example, of headlines referencing a “horrific” accident rather than a “serious” one, or a “disgusting” (versus “disagreeable”) ruling. They then reviewed real headlines—some true, some false—presented in one of two versions the researchers designed: either emotionally neutral or using charged language that could evoke fear or anger.

For example, a true, low-emotion headline read, “NYC wants to ‘end the COVID era,’ declares vaccine as a requirement for its workers.” The evocative version read, “Thousands being forced to take the jab: NYC mandates vaccines for its workers.”

Replicating the earlier work, the inoculation video, which runs less than two minutes, helped study participants flag manipulative content, particularly in high-emotion headlines. But it did not make them better at judging which information was accurate, even in the context most favorable to inoculation: when all false headlines contained highly emotional language and all true headlines were neutral.

“When the task is made more difficult by intermixing actual true or false claims,” the authors wrote, “the video appears to lose its effectiveness as an ‘inoculation against misinformation.'”

A final pair of studies explored the potential benefits of so-called accuracy prompts: simple reminders about the importance of considering accuracy and the threat of misinformation. Like inoculation, accuracy prompts alone proved ineffective at helping people identify true versus false claims (unlike in past work, where they improved the quality of the news people chose to share). But when the accuracy prompts were sandwiched around the inoculation video, participants’ identification of true headlines (but not false ones) improved significantly, by up to 10%.

“This shows that combining two techniques that can be readily deployed at scale can boost people’s skills to avoid being misled,” said Stephan Lewandowsky, professor at the University of Bristol, England, and a co-author of the research.

The results have significant implications for the growing field of designing misinformation interventions, the researchers said, highlighting for industry actors and policymakers the importance of testing and deploying multiple interventions in tandem.

“If you’re going to run these interventions, you should probably begin them with a base reminder about accuracy,” Pennycook said. “Just getting people to think more about whether things are true will carry over—at least in the short term—to what they’re seeing and choices about what they would share online.”

In addition to Pennycook and Lewandowsky, co-authors are Adam Berinsky and David Rand ’04, professors at the Massachusetts Institute of Technology; Puneet Bhargava, a graduate student at the University of Pennsylvania; and Hause Lin, a postdoctoral researcher at MIT.

More information:
Gordon Pennycook et al, Inoculation and accuracy prompting increase accuracy discernment in combination but not alone, Nature Human Behaviour (2024). DOI: 10.1038/s41562-024-02023-2

Provided by
Cornell University


Citation: Psychological inoculation: Combining two simple tools could combat election misinformation (2024, November 4), retrieved 4 November 2024.

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.



