We Believe Information We Like: A New Take on Confirmation Bias
You’ve probably heard of confirmation bias before. It’s one of those scientific psychology terms that has made its way into everyday use. People invoke confirmation bias to explain all kinds of seemingly irrational thinking.
The essence of confirmation bias is that people seek out and pay more attention to information when it’s consistent with what they already believe. So if someone already thinks that milk is healthy to drink, he’ll pay more attention to pro-milk information than to information saying milk isn’t that healthy. This makes people’s beliefs quite difficult to change: confirming information gets incorporated into the belief, and disconfirming information simply doesn’t.
Last year, I read about a similar study in psychology showing how people work to discredit information that challenges their views while readily accepting information that’s consistent with them.
A new study, though, raises questions about confirmation bias and challenges the way it’s usually treated.
Desirability Bias vs. Confirmation Bias
The issue with the existing confirmation bias research is that it usually lumps two distinct things together: in these studies, what people think is true and what people want to be true are typically the same. So it’s difficult to say whether people believe some information more because it’s consistent with their current beliefs (“confirmation bias”) or because it’s consistent with what they wish were true (“desirability bias”).
For instance, people tend to believe they have a lot of good qualities, and they also tend to wish to have good qualities. So if you tell someone, “You’re nice!”, she’s likely to believe it either because it’s consistent with how she sees herself already (confirmation bias) or because it’s consistent with how she wants to see herself (desirability bias).
Believing the Polls in the 2016 Election
To figure out which of these biases is really at work, researchers conducted a study during the last few months of the 2016 United States presidential election. What an interesting time!
Things were tense on all sides. Many polls showed that Hillary Clinton would win the presidency, but some polls said that Donald Trump would win. This made it a fascinating opportunity to pit desirability and confirmation against one another. Although many people believed that Clinton would win, some of those people didn’t want her to win while others did. Likewise, plenty of people thought that Trump would win while either wanting him to or hoping he wouldn’t.
So the researchers recruited more than 800 people to participate in a simple experiment. They asked everybody how confident they were that either Clinton or Trump would win the election, as well as which candidate they wanted to win.
Then everyone read a short article about recent polling numbers, but each participant randomly received one of two versions of the article. One version presented polling data emphasizing that Clinton was likely to win, and the other version gave evidence emphasizing that Trump was likely to win. After reading this, everyone again said how confident they were that either Clinton or Trump would win the election.
To see how this setup helps address our issue, consider one scenario in this study. What happens when a Trump supporter who thinks Clinton will win learns that the polls point to a Trump victory? A classic confirmation-bias account predicts that this person will reject the new information and continue to believe that Clinton will win; after all, the polls contradict his belief. A desirability-bias account, on the other hand, predicts that he will accept the new information and update his belief, even though it disconfirms what he originally thought.
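For readers who think in code, the two competing predictions can be sketched as toy functions. This is purely my own illustration of the logic of the design, not the study’s materials or analysis; the function names and labels are hypothetical:

```python
def confirmation_bias_update(believed_winner, poll_winner):
    # Confirmation bias: accept polls that match the existing belief,
    # discount polls that contradict it.
    if poll_winner == believed_winner:
        return "update toward polls"
    return "discount polls"

def desirability_bias_update(desired_winner, poll_winner):
    # Desirability bias: accept polls that match the desired outcome,
    # discount polls that contradict it.
    if poll_winner == desired_winner:
        return "update toward polls"
    return "discount polls"

# The Trump supporter from the text: believes Clinton will win,
# wants Trump to win, then sees a poll favoring Trump.
print(confirmation_bias_update("Clinton", "Trump"))  # discount polls
print(desirability_bias_update("Trump", "Trump"))    # update toward polls
```

Because the two accounts make opposite predictions for participants whose beliefs and desires diverge, the study could tell them apart.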
This research showed overwhelming support for the desirability bias, and it occurred for Trump and Clinton supporters alike. People updated their beliefs more when they learned information that was consistent with their desired outcome. In fact, people updated their beliefs the most when the article presented evidence that their preferred candidate would win and disconfirmed their initial belief.
We Believe What We Want to Be True
So it seems that the desirability bias pulls a lot of weight. Even when information disconfirms what we currently believe, if it supports what we want to be true, we’re happy to change our opinion.
Does this mean it’s time to throw away confirmation bias? This recent research suggests that what we’ve been calling confirmation bias may have been desirability bias all along. But we shouldn’t be too quick to toss out the idea that people can prefer information that coheres with their established beliefs.
Although the authors don’t discuss it, I was reminded of the research on self-verification theory. The idea is that when people have low self-esteem, they actually prefer to get negative feedback about themselves because it’s consistent with how they already see themselves. Surely in a case like this, people would desire a more positive self-image, yet they are nonetheless motivated by a need to confirm their current self-views. Time will tell how the desirability and confirmation biases play out in scenarios like these.