In my day job I'm a neuroscientist. Usually this does not intersect with politics, but today's an exception.
In June, my book co-author Sandra Aamodt and I wrote for the New York Times about how our brains lie to us, allowing the formation of false beliefs. Examples of false beliefs include rumors about Barack Obama's religion, or about John McCain fathering a mixed-race child. We didn't realize at the time just how relevant our topic would be in this year's campaign.
Dan Froomkin asked us whether brain science could be translated into practical lessons for journalists on how to prevent the formation of false beliefs. The answer is yes, so we wrote a piece for him outlining four principles to guide journalists.
(cross-posted at the Princeton Election Consortium)
Our article at Nieman Watchdog makes the general point that "Journalists should avoid presenting both sides of a story when one is false - and take into account how readers' brains process the disagreements."
We cite research finding that:
The human brain...does not save information permanently, as do computer drives and printed pages. Recent research suggests that every time the brain recalls a piece of information, it is "written" down again and often modified in the process. Along the way, the fact is gradually separated from its original context. For example, most people don't remember how they know that the capital of Massachusetts is Boston.
This phenomenon, known as source amnesia, leads people to forget over time where they heard a statement - and whether it is true. A statement that is initially not believed can gain credibility during the months that it takes to reprocess memories from short-term to longer-term storage. As the source is forgotten, the message and its implications may gain strength....
In [one] Stanford study, students were exposed repeatedly to the unsubstantiated claim that Coca-Cola is an effective paint thinner. Those who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than the National Enquirer), giving it a gloss of credibility. Thus the classic opening line "I think I read somewhere," or even reference to a specific source, is often used to support falsehoods.
Here's the corresponding lesson for journalists:
1. State the facts without reinforcing the falsehood. Repeating a false rumor can inadvertently make it stronger. In covering the controversy over a New Yorker cover caricaturing Barack and Michelle Obama, many journalists repeated the charges against the candidate - often citing polling data on how many Americans believe them - before noting that the beliefs were false. Particularly damaging is the common practice of replaying parts of an ad before debunking its content.
Here is some more background:
...psychologist Daniel Gilbert and his colleagues have shown that if people are distracted from thinking critically, they default to automatically accepting statements as true....ideas can spread by emotional selection, rather than by their factual merits. Memory formation is aided by the universal emotions of fear and disgust. Moral disgust played a role in 2000, when Bush campaign operatives spread false rumors that Senator John McCain had fathered a mixed-race child, damaging McCain’s support among southern Republican primary voters.
This leads to the following principle:
2. Tell the truth with images. Nearly half of the brain is dedicated to processing visual information. When images do not match words, viewers tend to remember what they see, not what they hear. Karl Rove has said that campaigns should be run as if the television's sound is turned down.
Television journalists should avoid presenting images that contradict the story....[one] recent story featured a threatening swarthy face subtitled "Obama the Antichrist?" - a statement that CNN would presumably not claim to be true.
The other two principles, and more of the science behind them, can be found by reading the whole article.