Expose their logical fallacies. Human cognitive biases, evolved for rapid responses, make us vulnerable to faulty reasoning.
- Propagandists deliberately use errors in arguments to appeal to the emotions of their audience. [1]
- Propagandists exploit cognitive biases and other elements of decision-making when shaping their messages to influence the target audience. [1]
- Awareness of logical fallacies in our own reasoning helps us reject misinformation directed at us. [1]
- Propaganda targets emotional reactions, not cognitive reasoning. [1]
- Counterpropaganda must target emotions as well as reason. [1]
- Four cognitive effects interfere with corrections to propaganda and misinformation: [2]
- Continued Influence Effect
- Familiarity Backfire Effect
- Overkill Backfire Effect
- Worldview Backfire Effect
This is the second of four posts pertaining to REASON - Counterpropaganda Principle #8: Logical Fallacies; Cognitive Biases; Continued Influence Effect of Misinformation; Debiasing Misinformation - Worldview and Backfire. The human ability to believe lies, and to maintain such belief in the face of contrary evidence, lies at the root of the problem of propaganda and counterpropaganda. The material necessary for a useful explanation is far too much for a single blog post.
Our Daily Kos blog reports:
The Nine Principles of Propaganda begins HERE.
Trump - Our Psychopathic President begins HERE.
Double-sided PDF copies are HERE:
The Nine Principles of Propaganda and Counterpropaganda
The Twelve Criteria of Psychopathy
The Forty Most Common Logical Fallacies
Cognitive Biases, Misinformation and the Brain
Cognitive biases, and the peculiarities of brain structure that arose during the course of human evolution, affect us not only when we face well-designed propaganda, but in all aspects of our daily lives. The following letter to the editor gives a succinct and clear explanation. It appeared in response to a Los Angeles Times article, "Measles is deadly; vaccines are not" (2019 Feb. 10). [Emphasis added below.]
Anti-vaccination mania stems from a brain wiring issue that plagues our species. It involves the interaction between the amygdala, which is designed to protect us, and the frontal cortex, which is designed to keep the amygdala honest.
When the amygdala is activated by threats real or imagined, it triggers the release of adrenaline which, in turn, hinders or even blocks access to the frontal cortex. The amygdala wants us to react, not think. The downside of this necessary mechanism is how quickly irrational threats (especially when powerfully and cleverly packaged) can go viral. With adrenaline pumping, groupthink and confirmation bias quickly can kick in, and presto, the irrational becomes self-evident (think climate change denial, slavery and countless other examples).
In the face of any threat, a vigilant frontal cortex is essential, especially when that perceived threat is counter to scientific consensus. It’s not that science is never wrong, but for people to reject it, the opposing evidence should be overwhelming, which is clearly not the case here.
A wise and wary frontal cortex knows all too well how wrong can feel so right. — Dale O’Neal, Clinical Psychologist, 16 February, 2019
Whether misinformation is about vaccinations, climate science, occult beliefs, black helicopters, waves of criminals pouring over our borders, abductions by extra-terrestrials or an Illuminati takeover, it's all false. It is the structure of our brains that causes us to seize onto such scare stories as causes for anxiety and fight-or-flight readiness, and to persist in believing them despite later facts, corrections and retractions. Only a skeptical mind (aka a "vigilant frontal cortex") can defeat the amygdala when it is in full fear mode.
Counterpropaganda and the Difficulty of Exposing Cognitive Reasoning Errors [3]
Propagandists exploit our human cognitive biases and logical fallacies in order to influence us. They know our weaknesses, shape their message to be credible and emotionally attractive, and slip their propaganda through these holes in our defenses. Counterpropagandists try to negate propaganda messages through exposing and resolving the target audience's errors in judgment. This method works similarly to Source (Counterpropaganda Principle #7) which, by revealing the true origin of a propaganda message, reduces the credibility of its broadcaster by exposing them as an enemy, dupe or liar. The hope is that the target audience, when made aware of their own logical fallacies, will then reject messages based on this faulty reasoning.
Unfortunately, recent studies strongly suggest that the effectiveness of propaganda rests not primarily on cognitive reasoning errors but on emotional reactions; propaganda works best when it targets our emotions. (See Propaganda Principles #4 Blame, #5 Provoke, #6 Crisis, #7 Emotional Symbols, #8 Pander.) Therefore, exposing a group's logical errors is not as effective in refuting propaganda messages as was originally believed.
Jacques Ellul [3][4] argues that the speed with which events occur, grow outdated, and lose interest leaves people too inattentive and unaware to think seriously about current events. The more superficial we become, the more effective propaganda becomes. We are simply too superficial to care about our own cognitive biases and logical fallacies. Yet Ellul stresses that this element of counterpropaganda remains critical because it exposes how our own mental weaknesses make us vulnerable to propaganda.
Cognitive Biases [5]
Cognitive biases are systematic patterns of deviation from rationality in judgment. Although we all share the same physical world, we each interpret our sensory input from this world to construct our own "subjective social reality." It is this interpretive construction, not the objective input from the world, which often dictates our social behavior in the world. Thus innate cognitive biases can lead to perceptual distortions, inaccurate judgments, illogical interpretations, or what is often called irrationality. Our cognitive biases are adaptive and beneficial when they lead to more effective decisions and actions within a given context, particularly when rapidity outweighs accuracy, when information is unavailable, or when information proliferates beyond comprehension. These biases appeared – and then persisted – during the course of human evolution because their benefits outweighed their detriments. But as human population grows and social relationships become ever more numerous and complex, this balance of benefits may be shifting towards the negative. We now need to know that these biases exist, how they work, and how they can be neutralized when necessary.
Here are thirty cognitive biases which pertain to propaganda and counterpropaganda. [6]
Affinity Bias: The tendency to associate with people like ourselves. Similarity can be based on race, religion, politics, age, sex, nationality, education, wealth, and so on.
Anchoring (or Focalism) Bias: The tendency to rely too heavily, or "anchor", on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).
Availability Cascade: A self-reinforcing process in which a collective belief gains more and more plausibility through increasing repetition in public discourse ("repeat something long enough and it will become true"). (See Propaganda Principle #3 — REPEAT.)
Availability Heuristic: The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.
Backfire (or Continued Influence) Effect: Responding to retractions, corrections, and disconfirming evidence by clinging more strongly to the original misinformation.
Bandwagon Effect: The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.
Belief Bias: Evaluating the logical strength of an argument based on the believability of its conclusion.
Confirmation Bias: The tendency to search for, interpret, focus on and remember information in a way that confirms your preconceptions. By re-confirming your views, you reduce inconsistencies in both information and belief, thereby minimizing uncomfortable feelings of cognitive dissonance.
Correspondence Bias: In contrast to interpretations of our own behavior, we tend to unduly emphasize the presumed internal characteristics (character or intentions), rather than external factors, when explaining the behavior of others. This effect has been described as "the tendency to believe that what people do reflects who they are."
Declinism: The predisposition to view the past favorably (“rosy retrospection”) and future negatively.
Dunning-Kruger Effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
Empathy Gap: The tendency to underestimate the influence or strength of feelings, in either oneself or others.
Framing Bias: Using a too-narrow approach to, and description of, a situation or issue. Framing causes people to react based on how the brain makes comparisons: loss vs. gain, inexpensive vs. expensive, better vs. worse.
Hindsight Bias: The “I-knew-it-all-along” effect; the tendency to see past events as being predictable at the time those events happened.
Illusion of Control: The tendency to overestimate one's degree of influence over other external events.
Illusion of Validity: Belief that our judgments are accurate, especially when available information is consistent or inter-correlated.
Illusory Truth Effect: A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual truth. These are specific cases of truthiness.
Irrational Escalation: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.
Overconfidence Effect: Excessive confidence in one's own answers to questions. For example, for certain types of questions, answers that people rate as "99% certain" turn out to be wrong 40% of the time.
Priming Bias: The tendency to be influenced by a preconceived idea caused by what someone else has said. Priming causes people to react based on how the brain groups information. You are more likely to recognize the word 'n_rse' as 'nurse' when it follows 'doctor,' but as 'Norse' when it follows 'Viking.'
Reactance: The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice. Related to ‘reverse psychology.’
Reactive Devaluation: Devaluing proposals only because they purportedly originated with an adversary.
Rhyme as Reason Effect: Rhyming statements are perceived as more truthful. A famous example is the defense's phrase in the O.J. Simpson trial: "If it doesn't fit, you must acquit."
Selective Perception: The tendency for expectations to affect perception.
Semmelweis Reflex: The tendency to reject new evidence that contradicts a paradigm you use.
Self-Serving Bias: The tendency to more often claim responsibility for successes than for failures. The tendency to evaluate ambiguous information in a way which benefits your own interests.
Social Comparison Bias: The tendency, when making decisions, to favor potential candidates who don't compete with one's own particular strengths.
Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.
Third-Person Effect: The belief that mass-media messages have a greater effect on others than on oneself.
Zero-Risk Bias: Preference for reducing a small risk to zero over a greater reduction in a larger risk.
Cognitive biases such as these enable us to quickly process information and to have greater understanding about what the rest of our group is thinking and feeling. In fight-or-flight situations for our ancestors, rapid decision making could easily make the difference between survival and death. Similarly, their group (family, clan, tribe, etc.) was essential to their individual survival, and they’d better know how to survive within the group and cooperate with the rest of the members. The cognitive biases evolved for good reasons, and – like it or not – we’re stuck with them. If we don’t become aware of our innate biases and learn to recognize and deal with them, the propagandists among us will continue to use them against us. Your own biases are either your tools, or theirs.
There are many more cognitive biases. Your Bias Is lists 24 in a very user-friendly format: anchoring, availability, backfire, Barnum effect, belief, bystander, confirmation, curse of knowledge, declinism, Dunning-Kruger effect, framing, fundamental attribution error, groupthink, halo, in-group, just-world hypothesis, negativity, optimism, pessimism, placebo, reactance, self-serving, spotlight, and sunk cost fallacy.
Wikipedia’s List of Cognitive Biases contains 190 items: 113 decision-making, belief & behavioral biases, 28 social biases, and 49 memory errors and biases. Don’t be biased against looking them up.
This is the eighth installment (part b) in our series on counterpropaganda.
THE NINE FUNDAMENTAL PRINCIPLES OF COUNTERPROPAGANDA
Propaganda is the backdoor hack into your mind
#1 Truth — Honest opposition is practical, moral, and unbiased.
#2 Focus — Address only one or at most two points.
#3 Clarity — Easily understood without further explanation.
#4 Resonate — Identify audience’s existing sentiments, opinions, and stereotypes that influence their perspectives, beliefs, and actions.
#5 Respond — Lies not immediately refuted become the audience’s truth.
#6 Investigate — Collect and analyze their propaganda to understand their message, target audience & objectives.
#7 Source — Expose covert sources of false propaganda.
#8 Reason — Expose their logical fallacies. Human cognitive biases, evolved for rapid responses, make us vulnerable to faulty reasoning.
#9 Disseminate — Share exposed propaganda with audiences not targeted; they can then recognize the lies and reciprocate.
Citations
[1] Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors
[2] Lewandowsky, Stephan; Ecker, Ullrich K. H.; Seifert, Colleen M.; Schwarz, Norbert; Cook, John. (2012 Sep 17). “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest, Vol. 13, No. 3, 106–131. doi: 10.1177/1529100612451018. A metastudy.
Read paper on Academia.edu
SagePub.com abstract and paper
SagePub.com purchase portal
ResearchGate portal
Link to free copy of the entire paper and chart; retrieved 1-26-19 from: http://bocktherobber.com/wordpress/wp-content/uploads/2013/04/lewandowsky_et_al_misinformation_pspi_ip.pdf
Link to useful chart of “Concise Recommendation for the Practitioner"
[3] Adapted from Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors
[4] Ellul, Jacques (1973). Propaganda: The Formation of Men's Attitudes (Reprinted ed.). New York: Vintage Books. ISBN 978-0-394-71874-3. Cited by Wikipedia – Counterpropaganda. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Counterpropaganda#Expose_reasoning_errors
[5] Adapted from Wikipedia – Cognitive Bias. Retrieved 2-4-19 from: https://en.wikipedia.org/wiki/Cognitive_bias
[6] Wikipedia’s List of Cognitive Biases contains 190 items: 113 decision-making, belief & behavioral biases, 28 social biases, and 49 memory errors and biases. Fear not; look them up. Retrieved and adapted 2-4-19 from: https://en.wikipedia.org/wiki/List_of_cognitive_biases
Additional Reading
Gilman, Jeff. (3-26-16). “What is the difference between framing and priming effects?” Quora.com. https://www.quora.com/What-is-the-difference-between-framing-and-priming-effect