Expose their logical fallacies. Human cognitive biases, evolved for rapid response rather than careful thought, make us vulnerable to faulty reasoning.
- Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
- Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
- Awareness of logical fallacies in our own reasoning helps people reject misinformation directed at them. [1]
- Propaganda targets emotional reactions, not cognitive reasoning. [1]
- Counterpropaganda must target emotions as well as reason. [1]
- Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
- Continued Influence Effect
- Familiarity Backfire Effect
- Overkill Backfire Effect
- Worldview Backfire Effect
This is the third of four posts pertaining to REASON - Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation - Worldview and Backfire. The human ability to believe lies, and to maintain such belief in the face of contrary evidence, lies at the root of the problem of propaganda and counterpropaganda. The amount of material necessary for a useful explanation is far too much for a single blog post.
Our Daily Kos blog reports:
The Nine Principles of Propaganda begins HERE.
Trump - Our Psychopathic President begins HERE.
Double-sided PDF copies:
The Nine Principles of Propaganda and Counterpropaganda — HERE.
The Twelve Criteria of Psychopathy — HERE.
The Forty Most Common Logical Fallacies — HERE.
Concise Recommendations for Dealing with Misinformation — HERE.
Cognitive Biases, Misinformation and the Brain
Cognitive biases and the peculiarities of brain structure that arose over the course of human evolution affect us not only when we face well-designed propaganda, but in all aspects of daily life. The following letter to the editor was cited in our prior post, Reason #8b - Cognitive Biases. We repeat it here because it gives a succinct, clear explanation of an issue of critical importance to this topic. It appeared in response to a Los Angeles Times article, "Measles is deadly; vaccines are not" (2016 Feb. 10). [Emphasis added.]
Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.
When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).
In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.
A wise and wary frontal cortex knows all too well how wrong can feel so right.
Dale O’Neal, Clinical Psychologist, 16 February, 2019
Whether misinformation is about vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extra-terrestrials or an Illuminati takeover, it's all false. It's the structure of our brain which causes us to seize onto such scare stories as a cause for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka "vigilant frontal cortex") can defeat the amygdala when it's in full fear mode.
Misinformation and its Correction [2]
The following is a condensation of an important metastudy, “Misinformation and its Correction: Continued Influence and Successful Debiasing.” Lewandowsky, Ecker, Seifert, Schwarz & Cook. (2012 Sep 17). Pgs 14-18 (Link to printable copy) [2]
Assessing the Truth of a Statement
Lies carry no warning labels. We usually cannot detect lies or errors until they are later retracted or corrected. In everyday conversation we normally and automatically default to the assumption that what we hear is true, clear and relevant. Research studies “…suggest that comprehension of a statement requires at least temporary acceptance of its truth before it can be checked against relevant evidence. On this view, belief is an inevitable consequence—or indeed precursor—to comprehension.” [Emphasis added] [2]
Is it compatible with what I believe?
Compatibility promotes acceptance. New information that fits with previously accepted information “feels right;” we accept it and file it with the rest. From then on, it becomes as highly resistant to change as the previously accepted information. Rejecting it would mean rejecting the previously accepted, now-resistant information as well. Acceptance of a statement also increases when it is printed in high color contrast and an easy-to-read font, when it rhymes, and when it is spoken in a familiar accent.
Good, coherent stories leave no gaps and are easily remembered.
They are highly resistant to change...
Is the message coherent?
The message should fit well within a sensible, broader story. Studies on mental models and on jury decision making have shown this to be especially true when the message cannot be assessed in isolation because it depends on other, related pieces of information. A compelling story organizes the available information, lacks internal contradictions, and is compatible with our common assumptions about human motivation and behavior. Good, coherent stories leave no gaps and are easily remembered. They are highly resistant to change because each element is supported by the other elements; altering one element alters the whole, making it less plausible. We understand and remember coherent stories more easily than incoherent stories; this ease-to-remember serves to make them more believable and more resistant to change.
Is the source credible?
A source’s credibility grows in importance as the listener’s motivation and understanding decline, and the more credible the source, the more we are persuaded. Yet non-credible sources can still influence us because people are often insensitive to context: testimony under oath is no more believed than testimony not under oath; studies funded by cigarette producer Corp. Z are as persuasive as studies from an independent commission with nothing to gain or lose. The gist of an interesting, coherent message is remembered long after its source is forgotten; good stories from untrustworthy sources will be remembered far longer than poor stories from credible sources. Mere repetition of a name, place or even an idea makes it more familiar, and hence more credible, even when it’s the same source repeating it over and over again. Even when a message was initially rejected, it may be accepted at a later time simply because it has become familiar.
Do others believe it?
Repetition increases acceptance. One 1945 study showed that repetition was the strongest predictor of belief in wartime rumors, possibly because it creates the illusion of social consensus. When others believe something, we feel it’s probably true, because we hear common, likely messages far more often than strange, unlikely ones. (We evolved to harbor that expectation.) Our innate bias tells us that such familiarity usually indicates social consensus.
Even when information has become familiar for poor reasons such as mere repetition of the same statement by the same source, the more often people hear or read it, the more they will believe it is widely accepted. Thus a single repetitive voice is treated as if it were a chorus. The “echo chamber” of social media networks creates such a chorus, and its repetition is especially influential, which explains the explosive proliferation of Russian bots, trolls and fake news. This can lead to “pluralistic ignorance,” the divergence between how common a belief actually is and how common we think it is. For example, during the lead-up to the 2003 invasion of Iraq, the majority of Americans who wanted multilateral intervention believed themselves to be in the minority because of the unilateral interventionists’ dominance of the American media, while the actual minority of unilateral interventionists falsely believed they were the majority. A 2008 study showed that Australians with strongly negative attitudes towards Aboriginals or asylum-seekers over-estimated support for their attitudes by 67% and 80%, respectively. The 1.8% of people in the sample with strongly negative attitudes towards Aboriginal Australians thought that 69% of all Australians (and 79% of their friends) agreed with them.
Such false social consensus can solidify and maintain belief in misinformation. How best to correct such misinformation? Correcting faulty truth assessments involves a competition between the perceived truth value of misinformation and correct information. Ideally the correction will undermine the perceived truth of misinformation and enhance the acceptance of correct information. But such corrections often fail to work as expected due to the presence of four cognitive problems, beginning with the Continued Influence Effect.
The Continued Influence Effect:
Retractions Fail to Eliminate the Influence of Misinformation
Continuing the condensation of “Misinformation and its Correction" Pgs 18-23 [2]
Numerous studies show that retractions are ignored and original misinformation is remembered even when the readers have no motivation to believe either. A common test narrative involves a warehouse fire initially thought to have been caused by gas cylinders and oil paints negligently stored in a closet. Readers then read this retraction: “The closet was actually empty.” Others see no retraction. When asked in subsequent testing, “What caused the black smoke?”, “Did you see a retraction?,” among other questions, those who did read the retraction continue to rely on the initial misinformation, even when they believed and understood the retraction and could later recall the retraction. At best such reliance on misinformation was reduced by 50%; in many studies, the retraction had no effect whatsoever.
Retractions and corrections in the media have even less effect, whether they immediately follow the initial report or appear at a later date. In the studies, clarifications of the misinformation – “paint and gas were never on the premises” – backfired; people became even more likely to rely on the misinformation. While some additions to the correction helped – “a truckers’ strike prevented the expected delivery of the items” – continued reliance on the misinformation could still be detected. Numerous additional studies have yielded the same results.
One possible explanation has to do with the mental models we create of unfolding events. In the warehouse scenario, negligence led to improper storage of flammables, followed by an electrical fault igniting the materials. When the negligence and the flammable materials are retracted, a hole is left in the mental narrative, and the narrative no longer “makes sense” unless the false assertion is retained. When questioned, study participants continued to respond with the misinformation despite being aware of the correction. If they are asked to explain why the misinformation might be true, it then becomes even more difficult to correct.
Studies show that people fill gaps in episodic memory with available inaccurate but “fitting” information. It may be that a complete but inaccurate model “feels” preferable to a model which is correct but incomplete. It “feels better” to stick to the original and coherent (but incorrect) narrative than to feel discomfort caused by a true-but-incoherent narrative.
Memory retrieval failure is another possible explanation. When valid and invalid memories compete for automatic activation, they might not be properly sorted. Questioning may activate misinformation when the misinformation supplies a plausible account of an event. The person would then need to actively think about which memories are true and which are false in order to sort them out, an effortful and uncomfortable task.
Thirdly, there is some evidence that retraction processing is like attaching a “Negation Tag” to a memory entry (e.g., “there were oil paints and gas cylinders—NOT”). Such mental tags can be lost, leaving only the misinformation behind. If this is true, negations should be more successful when they are an affirmation of an alternate attribute. “Jim is tidy” works better than “Jim is not messy” when negating the original message of “Jim is messy.” But “Jim is charismatic” has no such alternative other than “not charismatic.” People will replace “messy” with “tidy” far more reliably than replacing “charismatic” with “not charismatic.”
In another view, the effect of fluency (smooth processing of the information during a later re-exposure) can cause misinformation – even when not recollected – to increase perceived familiarity and coherence of a narrative. If so, retractions which repeat the misinformation can fail or backfire because they already “feel familiar.” Thus, reading “no paints and gas were present” reinforces the familiar misinformation that “paints and gas were present.” The retraction then becomes a repetition of the misinformation and makes the memory even stronger. “I heard that before, so there’s probably something to it.” Thus, because retractions repeat the misinformation, they may backfire by increasing familiarity and fluency of processing of the misinformation.
Such fluency-based effects make correcting misinformation difficult. “Myth vs. Fact” approaches are especially ineffective. Immediately after reading a hand-out from the U.S. Centers for Disease Control and Prevention, readers correctly identified the myths and the facts. Only 30 minutes later, they identified more myths as facts than people who had never read the handout at all, and they based their future plans (to refuse inoculations) on their erroneous recollections. Older adults and children are the most susceptible to such fluency-based backfire effects; they are also the most likely to accept explicit messages that the information is false.
When corporations pretend to be associated with events such as the Olympic Games – a practice known as “ambushing” – not only is the ambush successful, but attempts to expose them with counter-ambushing usually backfire and lead people to believe in the association even more.
Social Reactance can cause retractions to be ineffective. Many people don’t like being told what to think and how to act and may reject authoritative retractions. Numerous studies have presented mock jurors with evidence later ruled inadmissible. When jurors are asked to disregard the tainted evidence, they show higher conviction rates when an “inadmissible” ruling is accompanied by a judge’s extensive legal explanations than when the inadmissibility was left unexplained.
Reducing Misinformation’s Impact
Continuing the condensation of “Misinformation and its Correction" Pgs 23-27 [2]
Pre-exposure warnings
Explicit up-front warnings that people are about to encounter misleading information can significantly reduce its influence. One study found that such warnings must specifically explain the continuing influence of misinformation, not just mention its presence. This can be applied in advertising, in pseudoscientific or historical fiction, and especially in court, where jurors often hear information they are later instructed to disregard. Because our default behavior is to assume presented information is valid, warnings must come before exposure so recipients can “tag” the information as suspect – afterward is too late. Early warnings may work by inducing a temporary state of skepticism, which helps increase our ability to discriminate between true and false information.
Repeated Retractions
When misinformation is repeated, repetition of retraction can alleviate, but not eliminate, misinformation’s effects. However, the effect of a single presentation of misinformation persisted just as strongly after three retraction repetitions as it did after a single retraction. Misinformation effects are extremely hard to eliminate or drive below a base level of “irreducible persistence,” however strong the retraction(s). Several explanations for this phenomenon have been offered.
(1). Initial misinformation may be automatically processed into memory. Because changing or eliminating a memory requires conscious activity, the person must be aware of the misinformation’s automatic effect on their reasoning.
(2). When misinformation is presented once and “tagged – not” by a retraction, additional retractions don’t increase the strength of the “tag.” But because repeated misinformation is strengthened by each repetition, repeated retractions can help counter it by reinforcing the “tag – not” strength.
(3). A “methinks they protest too much” effect may result from repetitions of corrections.
(4). When misinformation is repeated within the retraction, it can backfire and increase acceptance of the misinformation.
Repetition of misinformation has a stronger and more reliable negative effect than the positive effect of repetition of retractions. This is especially unfortunate for social networks, where lies and propaganda quickly spread but corrections do not.
Filling the Gap – Providing an Alternative Account
In studies using the “gas, oil paints and warehouse fire” or equivalent scenario, it was found that retractions such as “there was no negligent storage of gas and oil paint in a closet” did not work, probably because they left holes in the narrative. Alternative explanations – “arson materials were found” – did work, probably by filling the narrative gap. Such alternative explanations must be plausible, account for the initial narrative’s main points and, ideally, explain why the initial misinformation was thought correct. Explanations of the motivation behind the incorrect report are particularly successful: “Someone overheard someone guessing at the cause and mistook it for fact.” Merely mentioning an alternative scenario will not reduce reliance on misinformation; it has to be solidly integrated into the explanation. Simple explanations are best; people will reject correct but complex explanations and cling to the wrong but simple explanation. Providing too many counter-arguments can backfire through “overkill.”
In political misinformation and corrections, where lies often abound and everyone’s motives are suspect, people tend to distrust corrections issued by sources they already regard as suspect.
In conclusion, there are three established techniques to reduce the continued influence of misinformation: Pre-exposure Warnings, Repeated Retractions, and – most effective – Alternative Explanations that fill the narrative gap. Unfortunately, it may take time to determine the correct explanation, and time is often critically short.
This is the eighth installment (part c) in our series on counterpropaganda.
THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind
#1 Truth — Honest opposition is practical, moral, and unbiased.
#2 Focus — Address only one or at most two points.
#3 Clarity — Easily understood without further explanation.
#4 Resonate — Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
#5 Respond — Lies not immediately refuted become the audience’s truth.
#6 Investigate — Collect and analyze their propaganda to understand their message, target audience & objectives.
#7 Source — Expose covert sources of false propaganda.
#8 Reason – Expose their logical fallacies. Human cognitive biases, evolved for rapid response rather than careful thought, make us vulnerable to faulty reasoning.
#9 Disseminate — Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.
Citations
1. Wikipedia - Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors
2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy. Pages 14-27.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner"