Expose their logical fallacies. Human cognitive biases, which favor rapid responses over deliberate thought, make us vulnerable to faulty reasoning.
- Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
- Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
- Awareness of logical fallacies in one's own reasoning helps people reject misinformation directed at them. [1]
- Propaganda targets emotional reactions, not cognitive reasoning. [1]
- Counterpropaganda must target emotions as well as reason. [1]
- Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
- Continued Influence Effect
- Familiarity Backfire Effect
- Overkill Backfire Effect
- Worldview Backfire Effect
This is the fourth of four posts pertaining to REASON — Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation — Worldview and Backfire.
Our Daily Kos blog reports:
The Nine Principles of Propaganda begins HERE.
Trump - Our Psychopathic President begins HERE.
Double-sided PDF copies:
The Nine Principles of Propaganda and Counterpropaganda — HERE.
The Twelve Criteria of Psychopathy — HERE.
The Forty Most Common Logical Fallacies — HERE.
Concise Recommendations for Dealing with Misinformation — HERE.
Cognitive Biases, Misinformation and the Brain
Cognitive biases and the peculiarities of brain structure that emerged over the course of human evolution affect us not only when we confront well-designed propaganda, but in all aspects of daily life. The following letter to the editor was cited in our prior postings, Reason #8b and #8c; we repeat it here because it gives a succinct, clear explanation of an issue of critical importance to this topic. It appeared in response to a Los Angeles Times article, "Measles is deadly; vaccines are not" (2016 Feb. 10). [Emphasis added.]
Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.
When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).
In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.
A wise and wary frontal cortex knows all too well how wrong can feel so right.
Dale O’Neal, Clinical Psychologist, 16 February 2019
Whether misinformation concerns vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extra-terrestrials or an Illuminati takeover, it is all false. The structure of our brain causes us to seize on such scare stories as triggers for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka "vigilant frontal cortex") can defeat the amygdala when it is in full fear mode.
Corrections in the Face of Existing Belief Systems: Worldview and Skepticism [2]
The following is a condensation of an important metastudy, “Misinformation and Its Correction: Continued Influence and Successful Debiasing,” Lewandowsky, Ecker, Seifert, Schwarz & Cook (2012 Sep 17), pp. 27-35. (Link to printable copy) [2]
The Importance of the Recipient’s Worldview
We more readily accept statements consistent with our current beliefs, and it follows that our “worldview” or “ideology” is a critical factor in the persistence of misinformation. For example, “Birther” beliefs and WMD-related claims persist among Republicans despite retractions, but not among Democrats. Conservatives are better than liberals at judging the risks posed by higher oil prices, and better at recognizing the magnitude of the risks of “peak oil.” Our pre-existing attitudes push us to keep believing worldview-consistent misinformation after retraction. Retractions of the claims that President Bush’s tax cuts in the early 2000s increased revenues and that Iraq had WMDs were effective only among Democrats; among Republicans they backfired, leaving them more committed to the misinformation. When people saw messages highlighting the adverse health effects of climate change, Democrats increased their support for climate-mitigation policies; among Republicans the messages backfired and support declined.
Even public-health messages can polarize. When people are told that Type 2 diabetes can be caused by poor nutrition and junk food – diets common among the poor – Democratic support for ameliorative public policy increases, while Republican support declines. Republicans disbelieved rebuttals of the “death panel” myth; Democrats accepted them. Even product brands are affected: those who love a particular brand suffer a loss of self-esteem when they read negative information about it, while those lacking emotional attachment to the brand remain unaffected.
Our pre-existing beliefs (“worldview”) affect the effectiveness of retractions of misinformation about real-world events. For example, supporters of the 2003 invasion of Iraq found it harder to accept the falsity of reports of WMDs in Iraq. The political-science literature confirms this view: the less one cares about a particular issue, the more one is influenced by factual or corrective information about it. Thus political issues per se do not necessarily lead to polarization.
Making things worse: Backfire effects
Society is damaged by misinformation concerning real-world issues such as climate change, tax policy, or decisions to go to war. In the real world, people tend to rely on misinformation that fits their worldview and are relatively immune to corrections; retractions can backfire and strengthen pre-existing beliefs. In studies where people are given information that challenges their worldview, they “counter-argue” or remain immovable (e.g., “We differ. I don’t believe them.”). In other studies, people exhibit “motivated skepticism”: they uncritically accept arguments for their own position but are highly skeptical of opposing arguments, actively counter-arguing to deride or invalidate the information. Such “boomerang” effects also appear when health messages are presented.
Belief polarization appears when the same information causes opposing views to diverge further. When religious believers and non-believers were exposed to a fictitious report disproving the Biblical account of the Resurrection, belief increased among believers and skepticism increased among non-believers. When presented with identical descriptions of technological breakdowns at nuclear power plants, supporters focused on the safeguards that worked to prevent a worse accident, while opponents focused on the fact that the breakdown occurred in the first place. Techniques for reducing belief polarization are very similar to those used to overcome worldview-related resistance to the correction of misinformation.
Feelings of affiliation with a source are also a factor in information acceptance. Republicans accepted corrections of the “death panel” myth more readily when they came from a Republican politician. Source credibility is also a function of belief: when you believe a statement, you judge its source as more credible. This cycle of resonance between belief and credibility can prevent opposing information from being judged credible enough to overturn cherished beliefs, however false. One study showed that belief-threatening scientific evidence can lead to the discounting of the scientific method itself, so that faith trumps empirical fact. Even education can fail: one study showed that increasing education made Democrats more likely to view global warming as a threat, and Republicans less likely. In another study, the more educated the Republican, the more likely he or she was to believe that President Obama is a Muslim (he is not); few Democrats held this mistaken belief, and their level of education was not a factor.
We cannot yet completely rely on party affiliation or any other worldview measure in order to predict responses to misinformation correction. Neither do we completely understand the underlying cognitive processes. It may be that when we are heavily invested in our personal worldview, changing it to accommodate inconsistencies is too costly or too difficult. Our worldview may well function as an overall plan for processing related information, one which forces the rejection of uncomfortable new truths.
Taming worldview by affirming it
Studies show that debiasing messages and retractions must be tailored to the specific audience’s worldview: when solutions are “framed” to fit that worldview, people accept them more easily. For example, “eco-centric” people, likely to dismiss nanotechnology as inherently unsafe, accept it more readily when it is presented as environmental protection. Climate-change deniers are less resistant when the information is presented as a business opportunity for the nuclear industry. Even simple changes in wording can increase acceptance by making information less worldview-threatening: Republicans find a “carbon offset” charge more acceptable than a “tax,” while Democrats don’t bridle at the word “tax” and accept either term.
When messages are made “self-affirming” – when recipients are given the opportunity to affirm their basic values as part of the corrective process – they are less worldview-threatening. When people were invited to write about a time they felt especially good about themselves after acting on a value important to them, they became more receptive to potentially worldview-threatening messages. Encouraging self-affirmation seems to give the facts a “fighting chance.”
Such self-affirmation helped people accept negative information about a favorite brand, apparently by letting them lower their brand esteem rather than their self-esteem. Helping people face their worldview inconsistencies makes it easier for them to accept corrections.
Skepticism—key to accuracy
Skepticism often reduces susceptibility to misinformation when people question the origin of information that may later turn out to be false. For example, people who initially questioned whether finding and destroying WMDs was the real reason for the 2003 invasion of Iraq were more accurate in processing later war news. Initial suspicion brought greater accuracy in processing later information and in recognizing correct information, but did not produce “cynicism” – a blanket denial of all war-related information. Courtroom studies show that mock jurors continue to be influenced by evidence later ruled inadmissible, even when they claim they are not, unless they become suspicious of the motives of the prosecutor who introduced the evidence.
Research shows that such skepticism interweaves with trust. Trust is fundamental to society, and while distrust is often corrosive, it can have a positive function. One study showed that people solve non-routine problems better after viewing a face rated “untrustworthy” by others; in contrast, eliciting trust helps people perform better on routine (but not non-routine) problems. This suggests that distrust sensitizes people to their environment, awakening them to the non-routine.
Healthy skepticism or induced distrust seems to help us reject misinformation. These benefits apparently come from the non-routine, more “lateral” information processing that is primed when people are skeptical or distrustful. Skepticism at the time of message exposure helps more than skepticism that arises afterwards, and misinformation can prevail even when it or its source is later identified as intentionally misleading. In one study, people were presented with an attitude-changing report about a heroin-addicted child and the effectiveness of social youth-assistance programs. They then received a retraction stating that the report was inaccurate either because of a mix-up (error condition) or because the author had invented sensational “facts” (deception condition). Retractions, especially in the deception condition, did bring participants to change their minds, but the effects of the misinformation could not be erased in either condition: its effects on attitude lingered even after a retraction established that the author had lied.
Using misinformation to inform
Brief interventions (e.g., “myth-vs.-fact”) do not work, but careful and prolonged dissections of incorrect arguments may. One experiment compared a standard teaching lecture with an alternative that explicitly refuted 17 common misconceptions about psychology while leaving other misconceptions unchallenged; explicit refutation was the more successful method. A review of the literature likewise argues for argumentation and rebuttal in science education, noting that classroom studies “…show improvements in conceptual learning when students engage in argumentation.”
Argumentation and engagement with an opponent may work even in the political arena. An analysis of over 40 opinion polls overturns the conventional wisdom that winning a debate requires avoiding dialog and highlighting only your own issues. Some studies suggest that even explicit misinformation can be used as a teaching tool. One study had students read “denialist” literature in order to learn about climate science: by analyzing misinformation and developing the skills required to detect its flaws, they gained actual knowledge. In-depth discussion of misinformation and its correction can help people work through inconsistencies in their understanding and promote the acceptance of corrections.
Debiasing in an Open Society
Continuing the condensation of “Misinformation and Its Correction,” pp. 35-36. [2]
Information moves fast and far in modern society, and false information quickly takes root among the unwary, so knowing how misinformation spreads and persists – and how to counteract it effectively – is of great practical importance. Consider Rwanda, where a year-long, large-scale field experiment took place in 2008-09. It found that a radio soap opera built around messages of reducing intergroup prejudice, violence, and trauma altered listeners’ perceptions of social norms and their behavior – albeit not their beliefs – compared with a control group that heard a health-focused soap opera. This confirmed that large-scale change can be achieved through conventional media.
Concise recommendations for the practitioner
- Consider what gaps in people’s mental event models are created by your debunking and fill them with an alternative explanation.
- Repeated retraction can reduce the influence of misinformation, although this also increases the risk of a backfire effect when the original misinformation is repeated and thereby rendered more familiar.
- To avoid making people more familiar with misinformation (thus risking a familiarity backfire effect), emphasize the facts you wish to communicate rather than the myth.
- Provide an explicit warning before mentioning the myth, to ensure people are cognitively on guard and less likely to be influenced by the misinformation.
- Ensure your material is simple and brief. Use clear language and graphs where appropriate. If the myth is simpler and more compelling than your debunking, it will be cognitively more attractive and you risk an overkill backfire effect.
- Consider whether your content may be threatening to the worldview and values of your audience. If so, you risk causing a worldview backfire effect, which is strongest among those with firmly held beliefs. This suggests that the most receptive people will be those who are not strongly fixed in their views.
- If one must present evidence that may be threatening to the audience’s worldview, possible ways to reduce the worldview backfire effect are (a) present your content in a worldview-affirming manner (e.g., by focusing on opportunities and potential benefits rather than risks and threats) and/or (b) encourage self-affirmation.
- The role of worldview can also be circumvented by focusing on behavioral techniques such as the design of choice architectures rather than overt debiasing.
Three Problem Areas for Future Research
Continuing the condensation of “Misinformation and Its Correction,” pp. 36-37. [2]
This survey of the literature enables us to provide a range of recommendations and draw some reasonably strong conclusions. However, it also identified a number of issues about which relatively little is known and which deserve future research attention. We wish to highlight three such issues.
Concerning the Role of Emotions, the evidence is mixed. Emotionally stirring misinformation is not necessarily accepted more readily than emotionally neutral misinformation; yet information likely to arouse others is passed on more often than sober, truthful information, meaning that the persistence of misinformation may depend on its emotiveness. Moreover, information and retractions that challenge people’s worldviews make them emotionally defensive.
Concerning the role of Individual Differences such as race or culture: people’s responses to the same information differ depending on their personal worldviews or ideology, but very little is known about other individual-difference variables. Intelligence, memory capacity, memory-updating ability, and tolerance for ambiguity are just a few factors that could potentially mediate misinformation effects.
Concerning Social Networks: while “cyber-ghettos” have been studied, we understand little about how misinformation disseminates through complex social networks and how those networks facilitate its persistence in selected segments of society.
Psychosocial, Ethical, and Practical Implications
Continuing the condensation of “Misinformation and Its Correction,” pp. 37-39. [2]
We conclude by discussing how misinformation effects can be reconciled with the notion of human rationality, before we address some limitations and ethical considerations surrounding debiasing, and point to an alternative behavioral approach to counteract the effects of misinformation.
The evidence shows that people cannot fully update their memories with corrective information, that worldview overrides fact, and that corrections can backfire. It is tempting, but premature, to conclude that people are “irrational” or cognitively “insufficient.” For example, when belief polarization was modeled within a Bayesian network that captures the role of hidden psychological variables during belief updating, behavior that initially appears “irrational” turned out to represent a normal, rational integration of prior biases with new information.
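To make that Bayesian point concrete, here is a minimal sketch of our own – a toy illustration, not the model from the cited study, with all priors and likelihoods invented for the example. Two agents apply Bayes’ rule to the identical piece of evidence (a report disputing hypothesis H), but they differ in their prior belief in H and in their prior trust in the source. Each update is perfectly rational, yet the gap between their beliefs widens:

```python
# Toy Bayesian model of belief polarization (illustrative values only,
# not the network from the cited study). Evidence E is a report that
# disputes hypothesis H; agents differ in their priors over H and over
# the source's reliability.

def posterior_h(p_h: float, p_reliable: float) -> float:
    """Return P(H true | E) after seeing a report E that disputes H.

    Assumed likelihoods (hypothetical, for illustration):
      P(E | H true,  source reliable)   = 0.1
      P(E | H false, source reliable)   = 0.9
      P(E | H any,   source unreliable) = 0.5  (an unreliable source
                                                is uninformative)
    """
    like_if_true  = p_reliable * 0.1 + (1 - p_reliable) * 0.5
    like_if_false = p_reliable * 0.9 + (1 - p_reliable) * 0.5
    numerator = p_h * like_if_true
    return numerator / (numerator + (1 - p_h) * like_if_false)

# A "believer" starts at P(H)=0.90 and distrusts the source (P=0.2);
# a "skeptic" starts at P(H)=0.20 and trusts the source (P=0.8).
believer = posterior_h(p_h=0.90, p_reliable=0.2)
skeptic  = posterior_h(p_h=0.20, p_reliable=0.8)

print(f"believer: 0.90 -> {believer:.2f}")              # ~0.87, barely moves
print(f"skeptic:  0.20 -> {skeptic:.2f}")               # ~0.05, moves sharply
print(f"belief gap: 0.70 -> {believer - skeptic:.2f}")  # ~0.81, gap widens
```

The divergence comes entirely from the differing priors about the hypothesis and the source; this is the sense in which apparently “irrational” polarization can emerge from rational belief updating.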
Debiasing raises ethical issues. We need a well-informed population, but debiasing techniques can also be used to misinform: correcting misinformation is cognitively indistinguishable from misinforming people by dismantling their previously held correct beliefs. The public must therefore have a basic understanding of misinformation effects. Being aware that propagandists “throw mud” because they know it “sticks” is an important part of developing a healthy skepticism and a well-informed populace.
Debiasing can be inefficient. Backfire effects make debiasing most effective among people who lack strong beliefs concerning the misinformation; when misinformation agrees with a strong worldview, retractions can do more harm than good by further strengthening the misbelief. If debiasing can’t be framed in a worldview-congruent manner, don’t bother. Alternatively, one can ignore the misinformation altogether and seek more direct behavioral interventions. “Nudging” techniques encourage certain decisions over others without preventing people from making a free choice: people will adopt low-emission behaviors when “nudged” by tax credits even when they are misinformed about climate science, and organ-donation rates nearly double with the simple and transparent “nudge” of changing the default option from “opt-in” to “opt-out.”
As noted above, healthy skepticism reduces susceptibility to misinformation when people question the origin of information that may later turn out to be false: initial suspicion brings greater accuracy in processing later information and in recognizing correct information, without tipping into the blanket denial of “cynicism.”
Unlike debiasing, “nudging” is not tied to a specific delivery vehicle that may fail to reach its target audience. Debiasing requires that the target audience actually receive the corrective information – difficult at best – whereas a “nudge” such as those described above “automatically” reaches everyone making the relevant choice. Nudging is especially applicable when an entire population must adapt quickly to prevent negative consequences (e.g., the Montreal Protocol, which rapidly phased out CFCs to protect the ozone layer); when ideology is likely to prevent the success of debiasing; and when there are organized efforts to deliberately misinform people (e.g., about tobacco smoke or climate change).
Vested interests can persist for decades in dispensing misinformation, as the tobacco industry did long after the causal link between smoking and lung cancer was established. By claiming that after 1964 there was still “room for responsible disagreement” with the U.S. Surgeon General’s conclusion that tobacco was a major cause of death, the industry is arguably trying to replace one myth (“tobacco does not kill”) with another (“the tobacco industry did not know it”). The primary strategy of such vested interests is to manufacture doubt about the certainty of scientific conclusions; most people do not understand the difference between scientific evidence and syllogistic proof, and treat a small uncertainty as if it were as meaningful as a large one. We need to understand the cognitive mechanisms of misinformation effects, but we must also monitor these socio-political developments to better understand why certain misinformation gains traction and persists in society.
This is the eighth installment (part d) in our series on counterpropaganda.
THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind
#1 Truth — Honest opposition is practical, moral, and unbiased.
#2 Focus — Address only one or at most two points.
#3 Clarity — Easily understood without further explanation.
#4 Resonate — Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
#5 Respond — Lies not immediately refuted become the audience’s truth.
#6 Investigate — Collect and analyze their propaganda to understand their message, target audience & objectives.
#7 Source — Expose covert sources of false propaganda.
#8 Reason — Expose their logical fallacies. Human cognitive biases, which favor rapid responses over deliberate thought, make us vulnerable to faulty reasoning.
#9 Disseminate — Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.
Citations
1. Wikipedia - Counterpropaganda. Retrieved 2019 Feb 4 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors
2. Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy. Pages 27-39.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to a free copy of the entire paper and chart; retrieved 2019 Jan 26 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendations for the Practitioner”