Suppose a close friend who is trying to figure out the facts about climate change asks whether you think a scientist who has written a book on the topic is a knowledgeable and trustworthy expert. You see from the dust jacket that the author received a Ph.D. in a pertinent field from a major university, is on the faculty at another one, and is a member of the National Academy of Sciences. Would you advise your friend that the scientist seems like an “expert”?
If you are like most people, the answer is likely to be, “it depends.” What it depends on, a recent study found, is not whether the scientist’s position is consistent with the one endorsed by the National Academy of Sciences. Instead, it is likely to depend on whether that position is consistent with the one held by most people who share your cultural values.
That was the finding of a study conducted by Yale University law professor Dan Kahan, University of Oklahoma political science professor Hank Jenkins-Smith, and George Washington University law professor Donald Braman, which sought to understand why members of the public are sharply and persistently divided on matters on which expert scientists largely agree.
“We know from previous research,” said Dan Kahan, “that people with individualistic values, who have a strong attachment to commerce and industry, tend to be skeptical of claimed environmental risks, while people with egalitarian values, who resent economic inequality, tend to believe that commerce and industry harms the environment.”
In the study, subjects with individualistic values were over 70 percentage points less likely than ones with egalitarian values to identify the scientist as an expert if he was depicted as describing climate change as an established risk. Likewise, egalitarian subjects were over 50 percentage points less likely than individualistic ones to see the scientist as an expert if he was described as believing evidence on climate change is unsettled.
Study results were similar when subjects were shown information and queried about other matters on which “scientific consensus” has been reached. Subjects were much more likely to see a scientist with elite credentials as an “expert” when he or she took a position that matched the subjects’ own cultural values on the risks of nuclear waste disposal and of laws permitting citizens to carry concealed guns in public.
“These are all matters,” Kahan said, “on which the National Academy of Sciences has issued ‘expert consensus’ reports.” Using the reports as a benchmark, Kahan explained, “no cultural group in our study was more likely than any other to be ‘getting it right,’” that is, correctly identifying the scientific consensus on these issues. “They were all just as likely to report that ‘most’ scientists favor the position rejected by the National Academy of Sciences expert consensus report if the report reached a conclusion contrary to their own cultural predispositions.”
In a separate survey component, the study also found that the American public in general is culturally divided on what “scientific consensus” is on climate change, nuclear waste disposal, and concealed-handgun laws.
“The problem isn’t that one side ‘believes’ science and another side ‘distrusts’ it,” said Kahan, referring to an alternative theory of why there is political conflict on matters that have been extensively researched by scientists.
He said the more likely reason for the disparity, as supported by the research results, “is that people tend to keep a biased score of what experts believe, counting a scientist as an ‘expert’ only when that scientist agrees with the position they find culturally congenial.”
These findings allowed the researchers to draw some conclusions about why scientific consensus so often fails to settle public policy debates when the subject touches on cultural commitments.
“It is a mistake to think ‘scientific consensus,’ of its own force, will dispel cultural polarization on issues that admit of scientific investigation,” said Kahan. “The same psychological dynamics that incline people to form a particular position on climate change, nuclear power and gun control also shape their perceptions of what ‘scientific consensus’ is.”
“The problem won’t be fixed by simply trying to increase trust in scientists or awareness of what scientists believe,” added Braman. “To make sure people form unbiased perceptions of what scientists are discovering, it is necessary to use communication strategies that reduce the likelihood that citizens of diverse values will find scientific findings threatening to their cultural commitments.”
The Journal of Risk Research published the study online today. The research was funded by the National Science Foundation’s Division of Social and Economic Sciences.