
AI News Bias Tool Created By Computer Scientists

Employs Artificial Intelligence to Develop News “Nutrition Label”

USC computer scientists have developed a tool to automatically detect bias in news.

The work, which combines natural language processing with moral foundation theory to understand the structures and nuances of content that consistently show up on left-leaning and right-leaning news sites, was presented at the International Conference on Social Informatics in the paper “Moral Framing and Ideological Bias of News.”

The researchers intend the tool to help consumers understand the viewpoints being presented, even when they are unfamiliar with a news source.

“Our ultimate goal was to grade news to create a news bias ‘nutrition label’ for when you are consuming news,” says Kristina Lerman, a principal scientist at the USC Information Sciences Institute and the corresponding author of the study.

While the scholars would like to quantify the percentage of content within an article that leans a particular way, for now they are quantifying a site’s content in aggregate.

The algorithm does not just search for keywords. Rather, it looks for more complex patterns in the contextual use of language that evoke key themes or frames, demonstrating the ability of the AI to recognize and understand human language and nuance.

Using categories defined in moral foundation theory, the algorithm found that content posted on left-leaning sites focused on fairness and equity, while content on right-leaning sites was couched in terms of moral purity and fear of contamination, as well as appeals to law and order. The authors plan to build on this work to understand ideological differences in the framing of news and other communications.
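To give a sense of the general idea, the sketch below scores text against moral foundation categories and aggregates the scores across a site's articles. It is a deliberately simplified illustration, not the researchers' method: the names FOUNDATION_LEXICON, foundation_profile, and site_profile are hypothetical, the mini-lexicon is made up, and the actual system relies on contextual language patterns rather than keyword counts.

```python
# Illustrative sketch only -- not the authors' model. It counts hits against a
# tiny hand-made moral foundations lexicon; the real system captures contextual
# patterns in language, not keywords.
from collections import Counter
import re

# Hypothetical mini-lexicon keyed by moral foundation (real lexicons such as
# the Moral Foundations Dictionary are far larger).
FOUNDATION_LEXICON = {
    "fairness":  {"fair", "equity", "equal", "justice", "rights"},
    "purity":    {"pure", "contamination", "disgust", "sacred", "sanctity"},
    "authority": {"law", "order", "obey", "authority", "tradition"},
}

def foundation_profile(text: str) -> dict:
    """Return the share of lexicon hits per foundation for one article."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for token in tokens:
        for foundation, words in FOUNDATION_LEXICON.items():
            if token in words:
                counts[foundation] += 1
    total = sum(counts.values()) or 1
    return {f: counts[f] / total for f in FOUNDATION_LEXICON}

def site_profile(articles: list) -> dict:
    """Average article-level profiles into a site-level 'nutrition label'."""
    profiles = [foundation_profile(a) for a in articles]
    return {f: sum(p[f] for p in profiles) / len(profiles)
            for f in FOUNDATION_LEXICON}

if __name__ == "__main__":
    articles = [
        "The ruling was praised as a victory for equal rights and justice.",
        "Officials urged citizens to obey the law and restore order.",
    ]
    print(site_profile(articles))
```

Aggregating per-article scores into a site-level profile mirrors what the researchers report doing for now: grading a site's content in aggregate rather than labeling individual articles.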

To the authors, such tools would help contextualize the content we read. As of now, says Lerman, articles are heavily coded and not easy for people to decipher.

The work runs on a parallel track to prior research by Lerman examining the polarization exacerbated in echo chambers on social media. One can imagine that articles shared in these echo chambers, if shared without context, further serve to polarize.

Lerman says by doing this, “You start to see cultural shifts.”
