Tech fixes can’t protect us from disinformation campaigns

More than technological fixes are needed to stop countries from spreading disinformation on social media platforms like Facebook and Twitter, according to two experts.

Policymakers and diplomats need to focus more on the psychology behind why citizens are so vulnerable to disinformation campaigns, said Erik Nisbet and Olga Kamenchuk of The Ohio State University.

“There is so much attention on how social media companies can adjust their algorithms and ban bots to stop the flood of false information,” said Nisbet, an associate professor of communication.


[Photo: Erik Nisbet]

“But the human dimension is being left out. Why do people believe these inaccurate stories?”

Russia targeted American citizens during the 2016 election with posts on every major social media platform, according to reports produced for U.S. Senate investigators.

This is just one example of how some countries have distributed “fake news” to influence the citizens of rival nations, according to the researchers.

In an invited paper just released in The Hague Journal of Diplomacy, Nisbet and Kamenchuk, a research associate at Ohio State’s Mershon Center for International Security Studies, discussed how to use psychology to battle these disinformation campaigns.

“Technology is only the tool to spread the disinformation,” Kamenchuk said.

“It is important to understand how Facebook and Twitter can improve what they do, but it may be even more important to understand how consumers react to disinformation and what we can do to protect them.”

The researchers, who are co-directors of the Mershon Center’s Eurasian Security and Governance Program, discussed three types of disinformation campaigns: identity-grievance, information gaslighting and incidental exposure.

Identity-grievance campaigns focus on exploiting real or perceived divisions within a country.

“The Russian Facebook advertisements during the 2016 election in the United States are a perfect example,” Nisbet said. “Many of these ads tried to inflame racial resentment in the country.”

Another disinformation strategy is information gaslighting, in which a country is flooded with false or misleading information through social media, blogs, fake news, online comments and advertising.

A recent Ohio State study showed that social media has only a small influence on how much people believe fake news. But the goal of information gaslighting is not so much to persuade the audience as it is to distract and sow uncertainty, Nisbet said.


[Photo: Olga Kamenchuk]

A third kind of disinformation campaign simply aims to increase a foreign audience’s everyday, incidental exposure to “fake news.”

State-controlled news portals, like Russia’s Sputnik, may spread false information that sometimes is even picked up by legitimate news outlets.

“The more people are exposed to some piece of false information, the more familiar it becomes, and the more willing they are to accept it,” Kamenchuk said. “If citizens can’t tell fact from fiction, at some point they give up trying.”

These three types of disinformation campaigns can be difficult to combat, Nisbet said.

“It sometimes seems easier to point to the technology and criticize Facebook or Twitter or Instagram, rather than take on the larger issues, like our psychological vulnerabilities or societal polarization,” he said.

But there are ways to use psychology to battle disinformation campaigns, Kamenchuk and Nisbet said.

One way is to turn the tables and use technology for good. Online or social-media games such as Post-Facto, Bad News and The News Hero teach online fact-checking skills or the basic design principles of disinformation campaigns.

Because campaigns to spread false information often depend on stoking negative emotions, one tactic is to deploy “emotional dampening” tools. Such tools could include apps and online platforms that push for constructive and civil conversations about controversial topics.

More generally, diplomats and policymakers must work to address the political and social conditions that allow disinformation to succeed, such as the loss of confidence in democratic institutions.

“We can’t let the public believe that things are so bad that nothing can be done,” Kamenchuk said.

“We have to give citizens faith that what they think matters and that they can help change the system for the better.”


The material in this press release comes from the originating research organization. Content may be edited for style and length.

