As someone with a long history of loving science, it can be infuriating to see people cast it aside in favor of what they’d prefer to believe. When evidence is staring you in the face, can you really opt to believe in a world that would make that evidence impossible? Science denial is troubling to scientists of all sorts, whose years of work can be brushed aside like vegetables on a dinner plate.
But who’s the biggest offender? Who denies science like it’s their job? Well, plenty have cast sideways glances at the politically conservative, accusing them of deep science denial.
After all, it seems that some major cases of science denial have consistently shown up in conservative talking points. For example, climate change denial and evolution denial have settled neatly into many conservative positions even though both are at odds with mountains of scientific research.
So is science denial a conservative’s game? Maybe there’s just something about conservative ideology that renders science untrustworthy and yucky. More liberal-minded people, meanwhile, may champion science, never questioning its insights the way their conservative counterparts do.
Or are both sides guilty?
Seeing What You Want to See
I’ve written before about “motivated reasoning,” the tendency to believe what we want to believe. We all have a sense of what we believe, like, and value, and rather than bother our brains to rethink those things, it’s easier to assume that new information is nicely consistent with where we already stand.
If motivated reasoning is pervasive, then there’s little reason to think liberals would be immune. Liberals have beliefs and opinions just like anyone else and may also prefer to hang onto them even if new information poses a challenge.
That’s at least what Anthony Washburn and Linda Skitka wondered before running a big study to see whether motivated reasoning and science denial are restricted to the political right or are instead present across the spectrum.
A Study About Studies…
Washburn and Skitka’s approach was to come up with plausible-sounding scientific studies related to a handful of hot-button issues–things like health care, immigration, and same-sex marriage. Since the studies were made up, the researchers could shuffle the “results” around to make the same study support one position or the other.
For example, in one of these fabricated studies, researchers looked at hospitalization rates in cities that had or hadn’t passed universal health care laws. Depending on how the numbers were arranged, the study could “prove” that universal health care decreased hospital admissions or that it increased them.
So people in the experiment would read about one of these scientific studies, but one extra piece was critical. Even though each study’s results objectively supported one position on the issue, they were presented in a way that made them a little confusing to understand. At a glance, the study would seem to support one position. It’s only if you think a little harder that you realize what the results actually mean.
Working Hard to Prove Yourself Right
It may seem complicated, but the researchers had set up a situation where getting the right answer took a little extra work. So when someone interpreted the study correctly, we know they weren’t fooled by the wrong interpretation that jumps out first. The researchers then looked at whether people’s initial opinions guided how much work they put into getting the right answer.
When the easy interpretation supported people’s pre-existing opinions, they called it a day. They didn’t think more deeply because they were satisfied with what the results appeared to show. That is, people came to the easy (incorrect) interpretation of the scientific research if that was the interpretation that agreed with their own opinion.
It’s when that first, easy interpretation challenges your opinion that you’re driven to dig deeper and discover the true meaning of the study–the meaning that agrees with you! In other words, people kept thinking about the study until they found a way to see it as supporting the opinion they already had.
Later, people would learn the correct way to interpret the study they read about, straight from the “scientists” themselves. If this (objectively correct) interpretation was still at odds with people’s own position on the issue, they were less likely to think the scientist was knowledgeable and trustworthy.
I Thought This Was About Ideology?
Good memory. This whole thing started because we wondered whether science denial was especially prevalent among more conservative-leaning people. All of the participants in this study had also indicated their political leanings, but the results were the same for liberals as they were for conservatives. There wasn’t any evidence that these biased interpretations of research or denials of scientists’ credibility were restricted to conservatives. Liberals were just as likely to go down that path.
In the end, what matters is whether the science supports or challenges your pre-existing opinion.
The study used “scientific research” that supported either position on issues like health care, gun control, nuclear energy, and immigration. And yes, political ideology was related to people’s opinions on those issues, but if the scientific results opposed people’s opinions–whatever those opinions were–people were biased toward interpreting the study as showing the opposite of what it actually did.
It can seem like conservatives are more critical of science because the cases that get the most attention are when science challenges predominantly conservative beliefs (e.g., climate change). But if science were to challenge liberal beliefs (e.g., showing that same-sex marriage caused mental health problems), liberals would be just as motivated to find flaws in the research and find ways to interpret it in more favorable ways. People are people. And people are biased.