A Handful of Toxic Accounts Are Warping How Americans See Each Other

Spend enough time in social media comment sections and you might conclude that half your neighbors are posting vile garbage. New research shows that’s wildly wrong, but the mistake itself is doing real damage.

Americans believe roughly 43 percent of Reddit users have posted severely toxic comments, according to a study published December 16 in PNAS Nexus. The actual figure from platform data? Three percent. On Facebook, people estimated that 47 percent of users had shared false news; researchers found it was closer to 8.5 percent.

Angela Y. Lee and Eric Neumann at Stanford University, working with Jamil Zaki and Jeffrey Hancock, surveyed more than 1,000 U.S. adults and compared their perceptions to large-scale behavioral data from both platforms. The disconnect wasn’t subtle. Across three studies, participants overestimated toxic behavior by factors ranging from five to thirteen.

Mistaking Volume for Numbers

The misperception stems from a basic attribution error. A small group of prolific accounts generates a disproportionate share of harmful content, following what statisticians call a power-law distribution. On Reddit, that toxic 3 percent produced about a third of all platform content. People encountered the material constantly and assumed it reflected a large, distributed user base rather than a loud minority.
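The dynamic described above can be sketched with a toy simulation. The 3 percent / one-third figures come from the article; the posting rates below are illustrative assumptions, not the study's actual model.

```python
import random

random.seed(42)

# Illustrative sketch: a tiny, prolific minority can produce a
# disproportionate share of all content. Posting rates here are
# assumptions chosen to roughly match the article's figures.
N_USERS = 10_000
TOXIC_SHARE = 0.03  # 3% of accounts, per the Reddit data

n_toxic = int(N_USERS * TOXIC_SHARE)

# Assume toxic accounts post heavily (~50 posts each) while
# typical accounts post rarely (~3 posts each).
toxic_posts = sum(random.randint(30, 70) for _ in range(n_toxic))
normal_posts = sum(random.randint(0, 6) for _ in range(N_USERS - n_toxic))

total = toxic_posts + normal_posts
print(f"Toxic accounts: {n_toxic / N_USERS:.0%} of users")
print(f"Their share of all posts: {toxic_posts / total:.0%}")
```

Under these assumed rates, 3 percent of accounts end up responsible for roughly a third of everything posted, which is the pattern a reader's feed reflects.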

When researchers tested whether participants could identify toxic comments, they performed well. The problem wasn’t recognizing harm when they saw it; it was figuring out who was responsible. Participants estimated that 38 percent of content came from toxic users, close to the actual 33 percent, but they assumed it took 38 percent of all accounts to produce that volume, when in fact 3 percent did. The volume estimate was right; the attribution was backwards.

“This indicates many people mistake an extremely vocal minority for a somewhat vocal majority, failing to realize that most social media users never post harmful content online,” Lee explains.

That misattribution carries consequences beyond statistics. When people believe half their fellow citizens are behaving badly online, they report feeling more negative emotions and stronger beliefs that American society is in moral decline. They also underestimate how many others want less harmful content, creating a false sense of isolation.

What Happens When You Correct the Record

In a final experiment, researchers showed some participants the actual data about user behavior. Compared to a control group, those who learned the true proportions felt measurably more positive and less convinced that the nation was deteriorating. They also became more aware that their peers shared their desire for better online spaces.

The intervention was simple: accurate information about prevalence. No persuasion, no emotional appeals. Just numbers that contradicted what people had assumed from their daily scrolling.

The study doesn’t downplay the seriousness of online toxicity or misinformation. Both remain genuine problems. But it suggests that current discourse often frames harmful behavior as far more representative than it actually is. Recognizing that most users never post toxic material might not fix the internet, but it could reduce unwarranted cynicism about the people using it.

PNAS Nexus: 10.1093/pnasnexus/pgaf310

