How come people don’t believe in science? It turns out science can actually give us answers to that question — well, some of us, anyway. I read an interesting piece about that recently, which I found all the more interesting because of how it relates to fact-checking efforts in politics.
In my recent Fact Check Death Match post, I cited the example of a woman who appeared in a TV ad complaining that Obamacare had increased her health care costs, and who literally couldn't believe it when independent researchers told her that wasn't true. The main explanation I cited was neurological: people have trouble overwriting a falsehood in their heads once that falsehood gets repeated. But there are also crucial psychological reasons why it can be hard to get people to accept a different factual conclusion:
Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation — a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information — and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”
The same thing happens in politics all the time. Even if you avoid repeating the falsehood when refuting something like climate denial, merely activating the argument in a committed denialist's brain will still bring his original justifications for the anti-science claim to the front of his mind.
Framing your rebuttal in terms of your opponent’s values is usually a more effective way to persuade him than shouting WRONG!
Furthermore, admitting to an error carries moral weight beyond the simple factual correction. Obviously, no one likes to identify himself as someone who gets things wrong and therefore can't be trusted. Likewise, if someone encounters evidence disproving a falsehood that his in-group widely embraces, then reaching a different conclusion from the rest of the group is essentially a betrayal of his comrades.
But what if you could convince someone that he was wrong without threatening his self-identity or his group identity? That would take some of the sting out of it and lower the psychological barrier to updating his views, right?
Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.
…
You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values — so as to give the facts a fighting chance.
There you have it: To convince a conservative, lead with conservative values and conservative validators. So why doesn't Mythopedia or the Factivist program use these conservative framing devices? Probably for the same reason they don't lead with conservative values and validators in any other situation: they're liberal, so they don't want to talk that way, and even if they did, they wouldn't have much standing with conservatives.
But who else has a vested interest in correcting the record on an issue like, say, climate change, yet isn't tied to a conflicting ideological position? An independent expenditure campaign of some sort might be our only hope.