The modern information landscape is a jungle of facts and lies. Every day, our feeds push claims about the world’s biggest mysteries. Secret global power plays. Health advice “they” are hiding. It is tempting to grab a simple answer that makes sense of the chaos. But why does one person swallow a sketchy story while another hits the brakes? Intelligence matters less than you’d think. What counts is who we are, what we believe truth actually is, and where we stand politically.
Two people can study identical data about a new tax law or climate policy and reach opposite conclusions. The facts stay the same. The judgment flips. This goes beyond honest disagreement. It is a reflex, a way our minds protect existing worldviews even when logic suffers.
A new doctoral thesis from Linköping University in Sweden explores why some people fall for misinformation while others don’t. Julia Aspernäs from the Department of Behavioural Sciences and Learning surveyed about 2,500 people in Sweden and the U.K., testing how they responded to different types of false content: empty pseudo-wisdom, logical fallacies, conspiracy theories, and science misinformation. The answer surprised her: how easily you’re fooled depends largely on what kind of lie you’re hearing.
The most striking pattern involved conspiracy theories. Even after carefully balancing political examples to avoid bias, right-leaning participants were more likely to accept and spread these theories than those on the left. The correlation was weak but consistent.
Why? Aspernäs isn’t certain. One possibility is simple exposure: more conspiracy theories circulate in right-wing environments.
Another explanation is psychological. Conservative-minded people tend to scan for threats, which might make them more receptive to narratives about hidden plots. But Aspernäs urges caution about reading too much into political divisions:
“I don’t know whether it helps us move forward, for example in public debate, by singling out certain groups. And the correlations we see, where right-leaning people stand out, are not very strong.”
That caution is well-founded. When researchers tested people’s ability to spot pseudo-profound bullshit (vague nonsense dressed as wisdom), the political divide vanished completely. Left and right were equally susceptible to finding deep meaning in meaninglessness.
When Everyone’s Judgment Fails
The real gut-punch came from testing politically motivated reasoning. Researchers gave participants syllogisms, logical arguments where a conclusion follows from two premises. The task was simple: judge whether the logic was valid, regardless of whether you agreed with the conclusion. Take a classic example: all flowers need water; roses need water; therefore, roses are flowers. The conclusion happens to be true, but the logic is broken, because needing water doesn’t make something a flower.
Both sides failed spectacularly when the conclusion matched their political beliefs. They would call a logically flawed argument “sound” just because they liked where it ended up. This is belief bias, and it hit left and right equally hard. The study found that “the tendency to fall for logical fallacies, and the occurrence of politically motivated reasoning in terms of belief bias (…), were similarly distributed across the left-right spectrum.”
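To make “valid” concrete, here is a minimal sketch (illustrative only, not code from the thesis) that treats an argument as valid exactly when no possible situation makes the premises true and the conclusion false, and applies that test to the rose syllogism above:

```python
from itertools import product

# Illustrative countermodel search (a sketch, not from the study).
# Syllogism: "All flowers need water. Roses need water.
#             Therefore, roses are flowers."
# An argument is VALID only if no possible situation makes both
# premises true while the conclusion is false.

def check_rose_syllogism():
    # Describe a rose-like thing by two yes/no properties.
    for is_flower, needs_water in product([True, False], repeat=2):
        premise_1 = (not is_flower) or needs_water  # "all flowers need water"
        premise_2 = needs_water                     # "roses need water"
        conclusion = is_flower                      # "roses are flowers"
        if premise_1 and premise_2 and not conclusion:
            return False  # countermodel found: the argument is invalid
    return True

print("valid" if check_rose_syllogism() else "invalid")
# Prints "invalid": something can need water without being a flower,
# so the believable conclusion does not follow from the premises.
```

Participants, of course, made this judgment by intuition rather than by exhaustive search, and that is exactly where belief bias crept in.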
When your identity is on the line, your ability to think clearly collapses.
Imagine walking through a loud city market at night: you smell warm spices from a street vendor and hear traffic rumbling, yet your attention lets only some of it through. Political identity works the same way, a mental filter that passes some details clearly while blocking others completely. It especially blocks logical errors in arguments that flatter your existing views.
The research dug deeper into philosophical territory: epistemic beliefs, or what people think truth actually is. The team measured two forms of truth relativism. Cultural relativism says truth depends on your cultural context. Subjectivist relativism goes further, claiming truth is just whatever feels true to you personally.
That subjectivist view turned out to be the key predictor. People who believed truth was merely subjective experience were more vulnerable to both pseudo-wisdom and conspiracy theories. They were also more likely to spread science misinformation, a tendency that appeared linked to both subjectivist beliefs and right-leaning ideology. This stood in contrast to truth realism, the idea that facts exist independent of human perspective.
The One Thing That Helps
Analytical thinking offered the only consistent defense. People who scored high on reflection and avoided snap judgments spotted logical fallacies better, regardless of their politics. High analytical thinking “seemed to help at least rightists avoid belief bias” when evaluating politically charged arguments.
The takeaway isn’t that one political tribe is smarter. It is that we all stumble badly when information threatens our self-image. Aspernäs puts it plainly: we “simply become worse at evaluating information in areas that matter to us, where it affects our self-image.”
But here is where the findings split. For misinformation that actively rejects mainstream facts, conspiracy theories and junk science especially, a specific combination creates vulnerability: right-leaning politics plus the belief that truth is just subjective feeling. Whether that worldview causes susceptibility or merely correlates with it remains unclear. Either way, the fight against fake news turns out to be philosophical at its core, not just about better fact-checking.
The thesis successfully replicated known links between political identity and analytical thinking while adding something new: subjectivist truth relativism as a key vulnerability factor. This gives researchers a stronger framework for understanding how people navigate today’s information chaos.
The study’s ultimate warning is uncomfortable. The best defense against misinformation probably isn’t consuming more facts. It is maintaining constant skepticism about the biases shaping how we see the world. We are most vulnerable not to what others tell us, but to what we’re desperate to believe.
Doctoral thesis, Linköping University. DOI: 10.3384/9789181183009
