Let's talk about “Motivated Reasoning” for a moment. Haven't you ever wondered why Uncle Harry and Aunt Gladys don't 'get it' when it comes to certain so-called truths? In spite of all the information you throw at them, they still won't budge from their firmly held beliefs.
I remember a discussion I had with my Uncle Alvin. I was a junior in college and we were discussing some – at that time – debatable issue. I pointed out that such-and-such was true and he, Uncle Alvin, said “no it ain't.” So, to illustrate my point and prove my uncle wrong, I hit the college library, pulled a volume of the Encyclopedia Britannica, and dog-eared the passage that proved my point. I showed it to Uncle Alvin with a gloating flourish and he said, “Encyclopedia Britannica? What do those Limeys know about this deal?” At that point, I closed the book, left him sitting in the living room, and drove back to my dorm room shaking my head in wonderment at my uncle's mulish hardheadedness. What I didn't know at the time is what I'm describing to you below: Motivated Reasoning.
Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
For instance, if I don't want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn't too emotionally invested to accept it, anyway. That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals besides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.
If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it's an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you're a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.
That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one's ideology.
It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?
There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.
Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are "system justifiers": They engage in motivated reasoning to defend the status quo.
You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Yale law professor Dan Kahan has called a "culture war of fact." In other words, paradoxically, you don't lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.