The Truth Is Out There. The Lies Are In Your Head.
— Terry Pratchett
Hi folks! Time for another edition of my Bootcamp series (see bottom of this page for the full list of links to Logical Fallacies Bootcamp and Cognitive Bias Bootcamp diaries).
Since over the course of these series I’ve had several requests for it, I figure it’s time to finally tackle it: Confirmation Bias!
So let’s get rolling.
First, what is it? Confirmation bias is (to borrow the definition straight from Dictionary.com, which puts it well) “the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories.”
There are actually several types of confirmation bias, so let’s take a moment to look at each of them.
Biased search for information
Just like it says. Sometimes also referred to as the “congruence heuristic,” if you’re looking for a fancier name to pull out at parties. Confirmation bias can lead us to seek out information that tends to affirm our existing beliefs. This is often how people start down the rabbit hole into hyperpartisan beliefs or conspiracy theories. They find a bit of information that tips them toward believing something, and their further information seeking, instead of being balanced and including sources that may disagree, ends up limited to info that supports that initial bit of belief, reinforcing it. Often those supporting sources include OTHER things that add on to that belief system, and before long you’ve got people thinking vaccinations cause autism, or lizard people run the world, or that Donald Trump had nothing to do with January 6th.
This one can manifest in multiple ways, and it has to do with how a question is posed. For example, posing a question in a positive way can lead people to seek out positive information, while posing it in a negative framing often has the opposite effect. One study examined this by giving participants a fake child custody case. When asked which parent should be allowed to keep the child, participants looked primarily at the positive aspects of each parent. When the question was reversed to “which parent should NOT be allowed to keep the child,” participants primarily looked at negative aspects.
And this little bias is why internet searches can be… well, let’s face it, they can suck, and we have to be VERY careful when searching for important information.
The problem is how search engine algorithms often function. They are often designed to figure out what information you WANT to see, rather than the information you NEED to see, or a balanced set of results. And this effect often builds over time. If Google knows you’ve been googling a bunch of stuff about UFOs being real, results confirming that belief will tend to find their way to the top, and contradictory stuff will fall down the list. Given that most people don’t search beyond the top few results, or at most the first page, information that contradicts what Google thinks they WANT gets buried.
And Google will also play off the way you frame the question to give biased results. Google “are cats better than dogs” and you’re more likely to get pro-cat sites. Reverse it to dogs vs. cats and you’re more likely to get pro-dog sites. So if you want better search engine results, one thing to look at is how you’ve framed the question: either come up with as neutral a framing as you can, or try multiple variations of the question from different angles.
The answer to the dog v. cat question is, of course cats are better, as my totally unbiased and neutral cat Chester will tell you.
Biased interpretation of information
Again, pretty straightforward. While a piece of information may be accurate, different people look at it through different lenses, and based on confirmation bias may interpret that information in support of their own position. Two people can look at the same piece of information and each come away with a different interpretation of it.
This is what can make it difficult to change someone’s mind even after providing them with information that you think counters their position. They look at the information, apply their confirmation bias filter, and either find an interpretation of the information that fits into their bias, or they reject it as less valid than information they feel confirms their beliefs.
A 2006 study (abstract here) tested subjects using fMRI imaging. Participants were asked about their political candidate preferences, then given information about the opposing candidate, about their preferred candidate, and about neutral, nonpolitical subjects.
What the researchers found was:
Motivated reasoning was associated with activations of the ventromedial prefrontal cortex, anterior cingulate cortex, posterior cingulate cortex, insular cortex, and lateral orbital cortex. As predicted, motivated reasoning was not associated with neural activity in regions previously linked to cold reasoning tasks and conscious (explicit) emotion regulation. The findings provide the first neuroimaging evidence for phenomena variously described as motivated reasoning, implicit emotion regulation, and psychological defense. They suggest that motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached.
Boiled down from the medical and professional jargon, this means that people’s brains were actively engaging in motivated reasoning when presented with negative information about their candidate, in order to reduce the cognitive dissonance that comes with receiving such information.
People tend to accept information that supports their existing views quickly and uncritically, while subjecting contradictory information to more criticism and deeper scrutiny.
Biased recall of information
Also sometimes called “memory bias,” this one relates to a couple of different memory theories. One theory is that striking information is more easily memorized and so tends to “stick” better in our brains. Another is that information that confirms our worldview is more easily remembered, while we tend to discard contradictory information.
This bias can also reinforce stereotypes, as people fall back on the comfortable “everyone knows” type of information and better recall the things that conform to well-known stereotypes. For example, librarians are stereotyped as introverted bookworms. In one study, participants were given a description of a fictional woman with a mix of introvert and extrovert traits and asked to judge her suitability to be hired as a librarian; they tended to remember and focus much more on the introvert traits they’d been given. Another group, given the same information but asked to judge the fictional subject’s suitability to be a real-estate agent, tended to remember and focus much more on the extrovert details.
The wrap-up
As is usually the case, I feel like a broken record, but the best way to deal with confirmation bias in ourselves is constant inward reflection: doing our best to come at information while consciously asking ourselves whether our own biases might be playing into how we’re reading it. We also need to avoid getting stuck in “information silos,” where everything we see is fed to us precisely because it confirms our existing beliefs. If all your news sources are saying the same thing all the time, and your social media and friends circle have been pared down to people and sites that feed your preconceived beliefs, you’re not doing yourself any favors.

We’ve seen the results of such things with the Right Wing Echo Chamber and how it’s produced fanatics like the type on trial for January 6th, or the harassment of survivors of Parkland or the parents of the Sandy Hook victims. But this sort of thing isn’t partisan to just one side; there are plenty of folks on the left who have fallen off the cliff in some way as well. I’m not trying to both-sides things here, just warning that we don’t want to fall into traps that feed our innate biases if we can help it.
When searching online, try to do multiple searches using as neutral a wording as you can, or ask from multiple angles (for example, my dog v. cat example: ask the question from the dog side AND the cat side to get both POVs). And don’t be afraid to go a page or two deeper into the results. Sometimes the information you really want has been buried a little deeper, because your search engine is trying to read your mind and feed you what it thinks you want rather than what you need.
Life would be easier if our brains came with an owner’s manual.
Then again, we know we’d all just skim it and toss it in a closet and end up winging it anyway.
Until next time, folks!
Prior Bootcamp Installments
And, as promised, here are links to previous installments:
Logical Fallacies Bootcamp:
The Strawman
The Slippery Slope
Begging the Question
Poisoning the Well
No True Scotsman!
Ad Hominem
False Dilemma
Non Sequitur
Red Herring
Gamblers Fallacy
Bandwagon Fallacy
Appeal to Fear
The Fallacy Fallacy
Appeal to Personal Incredulity
Appeal to Authority
Special Pleading
Texas Sharpshooter
Post Hoc
Appeal to Nature
Furtive Fallacy
Alphabet Soup
Cognitive Bias Bootcamp:
Bystander Effect
Curse of Knowledge
Barnum Effect
Declinism
In-Group Bias
Hindsight Bias
Survivor Bias
Rhyme-as-Reason Effect
Apophenia (& Pareidolia)
The Dunning-Kruger Effect