"Just the facts, Ma'am," Sergeant Joe Friday said on TV's Dragnet.
Or so many believe. He never said that line, but research suggests that media corrections are a poor way to dispel the false quote. They may even backfire. We fit facts to our beliefs, and that is troubling news for democracy.
More below the fold....
Just the Beliefs, Ma'am.
Joe Friday never said "Just the facts, ma'am." He did say "All we want are the facts, ma'am" and "All we know are the facts, ma'am." But don't expect a media correction to change the mind of someone who believes otherwise. Research shows we don't like to change our beliefs, or at least not for mere facts.
Indeed, contradictory facts may make us cling more stubbornly to our beliefs, a response termed "backfire" in a new article in Political Behavior by University of Michigan professor Brendan Nyhan and Georgia State University professor Jason Reifler. Their research seems to confirm the Will Rogers quote: "It isn't what we don't know that gives us trouble; it's what we know that ain't so."
Uninformed vs. Misinformed
Nyhan and Reifler build on a long series of studies showing that American citizens are woefully ignorant about political matters. Conventional wisdom says ignorance can be corrected by presenting people with facts, but as is so often the case, that wisdom is at best incomplete. Offering the facts works well when people are uninformed: when they don't know about an issue, and are aware they don't know. But Nyhan and Reifler found uninformed voters are less common than misinformed voters.
That is, people often already have beliefs about many issues, and those beliefs are often based on provably false information. Media stories that correct false claims may correct the opinions of uninformed people, but such stories are not as effective for misinformed people. The more strongly we are committed to our beliefs, the less likely we will respond positively to factual corrections ... and the correction may make us even more committed to a false belief.
The key lies in what Nyhan and Reifler call motivated reasoning, a concept I learned as confirmation bias. Most simply, we more readily accept information that confirms our preexisting beliefs and are more skeptical of information that contradicts our preexisting beliefs. Studies show motivated reasoning is evident in the sources we choose, in whether we challenge a source in discussing a news story, and in how we interpret claims. We don't like to be proved wrong, and our reasoning is motivated in large part by our desire to be right.
The "backfire" effect.
Nyhan and Reifler found that media stories correcting false claims can have what they call a "backfire" effect: pushing biased readers to cling to false claims even more strongly. Their research used mock news stories in which a public figure made a demonstrably false claim.
The stories were based on actual statements by public figures: President Bush's claims that Iraq was developing WMDs until the 2003 U.S. invasion and that his 2001 tax cuts increased tax revenue by stimulating economic activity, and claims that President Bush had banned all stem cell research in the U.S. Some stories stated the claim without any challenge, while others included a sourced, factual correction. After reading the mock stories, test subjects answered a series of related opinion questions.
Not surprisingly, subjects who identified as conservative or Republican more often believed Iraq was developing WMDs (false), that the 2001 tax cuts increased tax revenue (false), and that the stem cell research ban was limited (true). Subjects who identified as liberal more often believed Iraq had no WMD program in 2003 (true), that the 2001 tax cuts did not increase revenue (true), and that the stem cell research ban was complete (false).
The more interesting finding concerned the effect of sourced, factual corrections in the mock news stories. As expected, the correction increased confidence in opinions where the facts confirmed the reader's beliefs. Liberals who read the corrected stories were more confident that Iraq had no WMD program in 2003 and that tax cuts did not increase tax revenues, and conservatives who read the corrected stories were more confident that the Bush stem cell ban was limited.
But where the correction contradicted their beliefs, readers often became even more confident of their (false) opinions. Conservatives who read the corrected stories were more confident that Iraq had WMDs and that tax cuts increased tax revenues than conservatives who read uncorrected stories. The corrections "backfired," increasing their confidence in a false belief.
The study did not find the "backfire" effect among liberals, but neither did the corrections work. Liberals who read the corrected stories were just as confident as liberals who read the unchallenged stories that President Bush had completely banned stem cell research.
We cling to our beliefs despite contrary facts, and contrary facts in media stories may make us cling to our beliefs even more strongly. We can change our beliefs - we'll talk more about how tomorrow - but merely reading a media correction won't do it.
+++++
Happy Thursday!
Crossposted from Blogistan Polytechnic Institute (BPICampus.com)