It’s no secret that we’re fans of Skeptical Science’s The Debunking Handbook: we’ve found it to be an invaluable resource to beat back the latest nonsense from deniers. At its core, the Handbook is a cognitive psychology-based rebuttal to the idea that people just need to hear more facts to stop holding incorrect beliefs (an idea known formally as the information-deficit model). Instead, the Handbook embraces a more nuanced approach that emphasizes how communications are structured and formatted.
The bulk of the booklet is made up of tactics for working around the backfire effect: a phenomenon some have observed where attempts to bust a myth actually reinforce belief in it.
Last week, Slate published a new feature from Daniel Engber about the trouble scientists have had in replicating the backfire effect in years of follow-up studies. Engber pushes back on the newly-popular idea that we’re in a post-truth age, detailing the years of scientific back-and-forth concerning the backfire effect.
The basic idea of the backfire effect, he explains, is this: content used in attempts to disprove a myth can actually reinforce, instead of erode, the public’s belief in that myth. If a harried, half-distracted reader sees the sentence “global warming is not a hoax,” they are more likely to remember just the main, substantial words global warming hoax, and eventually forget that crucial little not. Like a boomerang, the backfire effect posits, a myth you try to cast aside comes back and smacks you. At least, that’s what initial research seemed to show. Over the years, Engber reports, the backfire effect grew into something of a myth itself.
As Engber explains, since multiple studies failed to replicate the backfire effect, it’s likely the original finding has been oversold, at least to an extent, particularly as distilled into a comic by the hilarious, awesome, and oft-educational Oatmeal. But even when that comic went viral last year, there was plenty of cause for skepticism.
It’s not exactly breaking news, then, that the backfire effect isn’t that huge a deal. Even the frequently-cited original findings were relatively marginal: there was a signal, but not a big one. It was worth taking the backfire effect into account, but it wasn’t nearly as big a deal as complaints about Snopes debunkings might have made it seem.
But what the studies Engber covers, and indeed the Debunking Handbook itself, fail to account for is that on certain (politically charged) issues, people don’t care about being accurate. These studies tend to assume people operate in good faith, searching for truth and respecting reality.
What we need instead is some way to measure people’s willingness to engage in bad-faith arguments: a study to define a unit of bad-faith engagement. Maybe we can call it a Stephen Millergram.