"A substantial amount of scholarship in political science has sought to determine whether citizens can participate meaningfully in politics."
So write Brendan Nyhan and Jason Reifler in their article on how we cling to beliefs despite contrary facts. Fact-based reasoning may be the ideal, but we're not idealized creatures.
More below the fold....
Why the Beliefs, Ma'am?
Yesterday we discussed a new article in Political Behavior by Brendan Nyhan and Jason Reifler. The article described their studies showing that media stories that include factual corrections do not overturn our preexisting beliefs. Indeed, at least among conservatives in the studies' sample groups, a factual correction may "backfire" and make readers more confident in a false belief. While liberals in those studies did not manifest the "backfire" effect, neither did the factual correction change their beliefs. The studies suggest that, when it comes to beliefs, we don't let facts get in the way.
The obvious conclusion is that we humans don't reason well. Rather than grounding our reasoning in facts and changing our beliefs when facts contradict them, we use what Nyhan and Reifler call motivated reasoning - what I learned as confirmation bias - selecting and interpreting facts to fit our preexisting beliefs. While I don't question their research method or results, I'm skeptical about some of their beliefs.
What is "good reasoning?"
Nyhan and Reifler open their article with this sentence:
A substantial amount of scholarship in political science has sought to determine whether citizens can participate meaningfully in politics.
The unstated assumption is that, at least in political matters, we should base our opinions on facts and change our beliefs when we encounter contradictory facts. That is a common model of good reasoning, but does it describe how most of us think? And if not, is the problem with our thinking ... or with our belief in that model?
In his book The Political Mind, cognitive scientist George Lakoff describes that as the "Old Enlightenment" model of reason: induction and deduction based on facts. In that model, to change someone's opinion you present a factual argument. Once they know the facts, good reasoning will lead them to agree with your position. And if they don't, the problem is their reasoning.
Except, as Lakoff notes, our brains are not wired to think that way. Instead we reason with frames, bundles of linked neurons that embody ideas as narratives, roles, scenes, objectives, strategies, and excuses. We fit our experiences to our existing frames, and the brain must select a frame before we can be aware of "thinking." Often we never get to conscious reasoning; our frames spill out conclusions that we call "intuition."
When we do conscious reasoning, it's often to convince ourselves that our "intuitive," frame-based conclusion is reliable. When a frame spills out a conclusion about which we're uncertain, we experience early signs of the fight-or-flight response. That often includes a tightening or sinking feeling in the abdomen, the physiological basis of the phrase "gut feeling." If we're comfortable with a conclusion, or can convince ourselves of it, the anxiety fades and the conclusion passes the "gut check." And as a social species, one primary test of a conclusion is whether others whose opinions we value agree with us.
So what about facts?
Facts are one element of that process, but they don't dominate it. When fact claims don't fit the neural wiring of our frames, or conclusions from those facts don't pass the "gut check" ... we follow our biology and our group, not the facts. Whether that's "good reasoning" is irrelevant. It's how our brains work in Realworldia.
The good news is that our reasoning is not as bad as it seems. Often we can't know all the relevant facts, or can't know them with enough certainty to be confident of a decision. Trusting familiar frames - beliefs - allows us to act on general principles even when we know we don't have enough specific facts. And Realworldia is chaotic enough that an "ideal," fact-based decision may not nudge the odds toward success, or away from failure, much more than a decision in which we're more confident, or one that has more group support.
Bandwagons of belief.
The best group decisions marry facts, insofar as they can be known, with shared beliefs. The facts matter, but it is the shared beliefs that enable the group to work together, overcome setbacks and achieve success. To counter false claims, and build support for better decisions, we must base our arguments on deep frames, and the deepest frames are moral values.
To be effective political advocates, we must build bandwagons of belief. In talking with undecided voters, we must first identify shared values. One starting point is the three-sentence Democratic Manifesto we often discuss here in Morning Feature: (1) People matter more than profits; (2) The earth is our home, not our trash can; and, (3) We need good government for both #1 and #2. Those are statements of values, and they are values with which most voters already agree.
Even if a given undecided voter won't agree to those as written, we can usually find similar value-statements on which to agree. Once we identify shared values, we can then explain how the facts fit into those values. That lets motivated reasoning work for us, as the undecided voter will be more open to facts that confirm the values on which we've agreed. And if we can't identify any shared values, no compendium of facts is likely to sway that undecided voter.
We can't "prove" our values are correct. Values are beliefs, not provable facts. But that we reason by values is a fact, and one we progressives must accept. Effective political advocacy must work in Realworldia, and to work in Realworldia we must build bandwagons of belief.
+++++
Your Kossascopes are in today's Campus Chatter.
Happy Friday!
Crossposted from Blogistan Polytechnic Institute (BPICampus.com)