Isn’t it amazing how, when someone is wrong and you tell them the factual, sometimes scientific, truth, they quickly admit they were wrong? Wait, that’s right. They don’t.
Instead, many of us will continue to argue something that simply isn’t true. Are we arguing for the sake of arguing? Or do we truly believe something even after being presented with evidence to the contrary?
It’s something that’s been popping up a lot lately thanks to the divisive 2016 presidential election. As a journalist, I see it pretty much every day. (Don’t even get me started on “fake news.”) But some days, it’s just too exhausting to argue the same facts over and over again.
Why Do Facts Not Matter?
So why, even when presented with logical, factual explanations, do people still refuse to change their minds?
A group of researchers at Dartmouth College wondered the same thing. Instead of just arguing with family and friends, they went to work. They began studying the “backfire effect,” a phenomenon in which “corrections actually increase misperceptions among the group in question” when those corrections contradict people’s views. Why? “Because it threatens their worldview or self-concept,” they wrote.
The Dartmouth researchers found, by presenting people with fake newspaper articles, that people receive facts differently depending on their own beliefs. If the source of the information has well-known views (say, a Democrat is presenting an argument to a Republican), the person receiving accurate information may still dismiss it as skewed. “But I know where she’s coming from, so she’s probably not being fully accurate,” the Republican might think while half-listening to the Democrat’s explanation.
Here’s how the Dartmouth study framed it:
People typically receive corrective information within ‘objective’ news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving a correct answer from an omniscient source. In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions—a view that is consistent with a wide array of research.
So, basically, when hearing information, we pick a side, and that, in turn, simply reinforces our existing view. Even when presented with facts, our opinion has already been determined, and we may actually hold that view even more strongly to “fight back” against the new information. When Kellyanne Conway coined the term “alternative facts” in defense of the Trump administration’s claims about how many people attended the inauguration, this phenomenon was likely at play.
“If people counterargue unwelcome information vigorously enough, they may end up with ‘more attitudinally congruent information in mind than before the debate,’ which in turn leads them to report opinions that are more extreme than they otherwise would have had,” the Dartmouth researchers wrote.
This tendency to embrace information that supports a point of view and reject information that does not is known as “confirmation bias.” There are entire textbooks and many studies on the topic if you’re inclined to read them, but one study from Stanford in 1979 illustrates it quite well.
Researchers recruited a group of students who held different opinions on capital punishment. Some believed it deterred crime, while others said it had no effect. The students were then shown fake studies supporting both sides of the argument. As you’ve probably guessed by now, those who supported capital punishment rated the pro-deterrence data as highly credible and the anti-deterrence data as unconvincing. The opposite was true for those who opposed capital punishment.
At the end of the study, the students who favored capital punishment before reading the fake data were now even more in favor of it, and those who were already against the death penalty were even more opposed. Surprised? Probably not.
What Can We Do If Facts Don’t Matter?
How can we avoid losing our minds when trying to talk facts?
Julia Galef, president of the Center for Applied Rationality, says to think of an argument as a partnership.
“Instead of thinking about the argument as a battle where you’re trying to win, reframe it in your mind so that you think of it as a partnership, a collaboration in which the two of you together or the group of you together are trying to figure out the right answer,” she writes on the Big Think website.
In an interview with NPR, one cognitive neuroscientist said that, for better or for worse, it may be emotions, not facts, that have the power to change our minds. For example, our opinion on military spending may stay fixed, no matter what new facts we’re shown, until the day our son or daughter decides to enlist.
And if you’ve given up on defending the facts altogether, you can at least find some humor in it, right?