I ran across this in Mother Jones yesterday. It's a fascinating study, but unfortunately the results seem to affect those who are very passionate partisans (which admittedly I am, of course) on both sides of the political spectrum.
According to a new psychology paper, our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.
More depressing thoughts below that squiggly doodad.
Here's the gist of the study design:
At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their "numeracy," that is, their mathematical reasoning ability. Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: While the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a "new cream for treating skin rashes." But in other cases, the study was described as involving the effectiveness of "a law banning private citizens from carrying concealed handguns in public."
And the results?
The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream. What's more, it turns out that highly numerate liberals and conservatives were even more—not less—susceptible to letting politics skew their reasoning than were those with less mathematical ability.
The study goes on to describe a numerical story problem involving skin cream, which subjects had to examine closely and compute ratios from in order to answer correctly. It found that the more "numerate" the subject was, the more likely they were to get the right answer. (A rough sketch of the arithmetic involved appears below the quoted results.) But then it gets interesting (and sad): a similar question was posed on a more partisan issue, gun control, and the results are quite different:
Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment)—an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).
The opposite was true for highly numerate conservative Republicans: They did just great when the right answer was that the ban didn't work (version D), but poorly when the right answer was that it did (version C).
Bleh. Not good.
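For the curious, here's a rough sketch of why the skin-cream version trips people up. This is just an illustration in Python with made-up numbers, not the actual figures from the study: the trap is that comparing raw counts points one way, while comparing the ratio of improved to worsened patients within each group can point the other.

```python
# Illustrative sketch only: the numbers below are invented for this
# example and are NOT the figures used in the actual experiment.

def improvement_ratio(improved: int, worsened: int) -> float:
    """How many patients improved for every patient who got worse."""
    return improved / worsened

# Hypothetical 2x2 results table.
cream_improved, cream_worsened = 200, 80        # patients who used the cream
control_improved, control_worsened = 100, 20    # patients who did not

# The intuitive shortcut: more people improved with the cream (200 > 100),
# so the cream must work. That's the tempting but wrong comparison.
naive_conclusion = cream_improved > control_improved                      # True

# The correct comparison: the improvement ratio within each group.
cream_ratio = improvement_ratio(cream_improved, cream_worsened)           # 2.5
control_ratio = improvement_ratio(control_improved, control_worsened)     # 5.0

print(f"Cream group:   {cream_ratio:.1f} improved per patient who worsened")
print(f"Control group: {control_ratio:.1f} improved per patient who worsened")
print("Does the cream help?", cream_ratio > control_ratio)                # False
```

With these invented numbers the cream looks better on raw counts but worse on ratios, which is exactly the kind of counterintuitive answer the study relied on to separate gut feeling from actual calculation.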
The author goes on to state that the results tend to refute the so-called "deficit model," i.e., the idea that giving someone more information on a particular subject will educate them and lead them to the right conclusions about it.
But why is this happening? The author suggests:
What's happening when highly numerate liberals and conservatives actually get it wrong? Either they're intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further—or else they're stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn't equal 2 in this particular instance. (Kahan suspects it's mostly the former, rather than the latter.)
Bummer. So do we give up our passions in order to reason more numerately? I always knew conservatives had difficulties with basic reasoning when confronted with factual evidence that directly contradicts their views. It's a bit more humbling to learn that passionate liberals like myself can fall victim to this as well.