A friend just emailed me a link to a math puzzle that was published in the New York Times in July 2015.
I don’t want to prejudice your experience with the puzzle by asserting how simple or hard I thought it was, or by suggesting how much time you might spend on it. But I do include spoilers below, so I will simply say this: you will undoubtedly have a different experience if you attempt to solve the puzzle first and then read the rest of this piece than if you read on without following the link.
For those of you who want to experience the puzzle first, I’ll add a little filler space by quoting from the puzzle’s instructions here. My opinions are given afterward.
We’ve chosen a rule that some sequences of three numbers obey — and some do not. Your job is to guess what the rule is.
We’ll start by telling you that the sequence 2, 4, 8 obeys the rule.
Now it’s your turn. Enter a number sequence in the boxes below, and we’ll tell you whether it satisfies the rule or not. You can test as many sequences as you want.
When you think you know the rule, describe it in words below and then submit your answer. Make sure you’re right; you won’t get a second chance.
Please note that the bolded emphasis appears in the original article.
OK, now here’s a bit more from the NYT puzzle: what you see after you submit a wrong answer.
The answer was extremely basic. The rule was simply: Each number must be larger than the one before it. 5, 10, 20 satisfies the rule, as does 1, 2, 3 and -17, 14.6, 845. Children in kindergarten can understand this rule.
But most people start off with the incorrect assumption … and then they make a classic psychological mistake.
They don’t want to hear the answer “no.” In fact, it may not occur to them to ask a question that may yield a no.
This disappointment is a version of what psychologists and economists call confirmation bias. Not only are people more likely to believe information that fits their pre-existing beliefs, but they’re also more likely to go looking for such information. This experiment is a version of one that the English psychologist Peter Cathcart Wason used in a seminal 1960 paper on confirmation bias.
First off, I commend the 2015 NYT author for developing a web page that illustrates the concept of confirmation bias and gives readers a booster shot to help inoculate them against ill-designed tests. However, I disagree with what the author labeled as the incorrect assumption (“it must be a somewhat tricky problem”). Yes, I absolutely did make a wrong assumption, but it wasn’t that the problem was tricky. It was that I saw an obvious pattern, and it blinded me to other possible rules. I tried a half dozen sets of numbers, all over the map, but (a) I didn’t try any non-integers, (b) I didn’t try any negative numbers, and (c) I didn’t try any obvious counterexamples to confirm that the web machine would correctly register a wrong guess. As a scientist by education, I fault myself for this failure to expand my domain of eligible hypotheses into the realm of obviously wrong answers. But…
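To make this concrete, here is a minimal sketch (my own illustration, not code from the article) of the puzzle’s hidden rule alongside one plausible-but-wrong hypothesis. The function names are mine. The point it demonstrates: every confirming test satisfies both rules, so only a disconfirming test — one the wrong hypothesis predicts should fail — can tell them apart.

```python
def hidden_rule(seq):
    """The NYT puzzle's actual rule: each number is larger than the one before it."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def doubling_hypothesis(seq):
    """The 'obvious' pattern suggested by 2, 4, 8: each number doubles the last."""
    return all(b == 2 * a for a, b in zip(seq, seq[1:]))

# Confirming tests alone cannot separate the two rules --
# every doubling sequence also happens to be increasing:
for seq in [(2, 4, 8), (3, 6, 12), (10, 20, 40)]:
    assert hidden_rule(seq) and doubling_hypothesis(seq)

# A deliberately "wrong-looking" test is what actually distinguishes them:
assert hidden_rule((1, 2, 3))            # increasing, but not doubling
assert not doubling_hypothesis((1, 2, 3))
assert not hidden_rule((3, 2, 1))        # a true counterexample to both
```

The half dozen sequences I tried were all of the confirming kind in the loop above, which is exactly why they taught me nothing.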
I also believe this was a flawed test, and it concluded with an even more flawed evaluation. Although the test did say you could try as many combinations of three numbers as you wanted, it also implied a sense of urgency or finality by saying, “Make sure you’re right; you won’t get a second chance.”
This admonition cuts both ways. Test all you want (…that’s good advice), but you won’t get a second chance (…that’s not very good advice when you are being scored by a dumb website and the consequence really isn’t very grave if you get it wrong).
In real life, some of the most valuable feedback comes precisely when your hypothesis gets rejected. If the website’s response had simply been “NOPE – your hypothesis is wrong,” instead of “NOPE, you are wrong, and here’s the right answer so you can never try again,” I would have guessed more and more broadly and asked more and more probing questions.
Yet even with better (more forgiving) prompting on the final answer, it’s entirely possible that the rule could have been tricky enough that no reasonable person would ever have guessed it. For example, the rule could have been “each number must be larger than the one before it, as long as none of the numbers exceeds 6.02E+23.”
At the risk of sounding like a sore loser, I was also offended that test takers were effectively declared an impediment to progress for not performing enough of the right kinds of tests to get the right answer on the first try. That’s not how education works. Thomas Edison even said that he didn’t fail in 999 of the 1,000 designs he attempted for his light bulb; he succeeded 999 times in determining how not to invent one.
In other words: failure is OK. Failing itself does not constitute confirmation bias. The danger of confirmation bias enters the picture if we refuse to acknowledge our failure after it has been explained to us. If we willingly refuse to believe something after we see the contradictory evidence, that’s when our bias has overwhelmed our conscience.
I suspect none of the readers of this opinion piece is actually guilty of confirmation bias when confronted with reasonable evidence to the contrary. Capisce? Comprende? Verstehe? We are willing to understand if we are willing to experience failure along the path to learning.
Now, let’s switch gears a moment and go back in time to the summer of 2016. Candidate Donald Trump is in Albany, NY and is holding one of his signature rallies when he coins a new phrase (emphasis mine):
“You are going to be so proud of your country if I get in. … Because we are going to turn it around, and we are going to start winning again. We are going to win so much. We are going to win at every level. … We are going to win so much, you may even get tired of winning.”
On the surface, this looks like typical political fluff: give people hope (like Obama did with “hope and change”) and then give them something more, optimism and esprit de corps. But be sure your exaggerations are as broad as possible, so you have plenty of latitude to define what winning means.
I believe this was a major turning point in Trump’s general election campaign against Clinton. People who were on the fence thought, “Hey, maybe this guy really does have magic up his sleeve. Clinton (like Carter before her) doesn’t say stuff like that, and damn, she never ran a business like this guy did, so maybe he will deliver!”
America likes its clever marketers almost as much as it likes its clever inventors. Steve Jobs and Henry Ford were good examples of business leaders who succeeded with both hats on.
But now, nearly five years later, we have irrefutable evidence that Trump is the worst kind of liar possible: one who cares nothing about understanding. He cares nothing about getting the right answer. He only cares about getting people to believe him and disbelieve their own senses. Here is a good analysis of his modus operandi, or you can just watch the video here at the 1:22 mark.
In his Albany rally, Trump was setting up his audience not only to become victims of their own incipient confirmation bias, but to become willing accomplices in the spread of his future lies and future attacks on the rule of law and our democratic system.
Thinking can be a humbling process, and we often find it difficult to overcome our own biases. Here is an article by Tom Stafford of the BBC that we blue-thinkers ought to consider when we are trying to persuade our red-hatted neighbors and relatives that they ought to “Question Authority,” especially when it’s their own brain they are trusting implicitly (i.e., by rejecting all reasonable counter-arguments).
The Stafford article explores two theories of confirmation bias — the motivational theory (“be as open minded as possible”) and the cognition theory (“consider the opposite”). The results may amaze you, so please consider reading his illustrations.
I will leave you with Stafford’s closing statement. I believe that we can and must be the “someone else (who) points out the alternatives” to our misguided friends and neighbors. We must keep trying if we are to save our democratic form of government from demagogues who salivate thinking about 70 million voters with incurable confirmation bias.
The moral for making better decisions is clear: wanting to be fair and objective alone isn't enough. What's needed are practical methods for correcting our limited reasoning – and a major limitation is our imagination for how else things might be. If we're lucky, someone else will point out these alternatives, but if we're on our own we can still take advantage of crutches for the mind like the "consider the opposite" strategy. (From the Stafford article linked above.)