There has been a good deal of talk about epistemic closure, and lately the reality checks have been hitting the GOP one after another. The skewed polls. The conservative bubble. “Legitimate rape.” Failed predictions. Karl Rove making his own reality and defending it in the face of election results. Shell-shocked presidential contenders. Yes, the last two weeks have been good for popcorn with schadenfreude on the rocks. Jon Stewart must be thanking the gods. But let's not miss an opportunity to learn from their mistakes.
Confirmation bias is the well-documented tendency to selectively gather and interpret information: rejecting evidence inconsistent with a presently held theory rather than rejecting theories inconsistent with new evidence. Psychologists have studied this bias for decades and demonstrated it in dozens of situations.
Peter Wason coined the term "confirmation bias" in 1960 to describe his subjects' failure to discover the rule governing a number series. Subjects were given the set 2, 4, 6, asked to guess the rule used to generate it, and invited to test their guess by proposing number sets of their own. They consistently proposed sets that conformed to their own hypothesis: if they believed the rule was increasing even numbers, they offered 12, 14, 16, or 50, 52, 54. Subjects rarely discovered that the actual rule was substantially simpler than the example suggested: any three increasing numbers.
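The logic of Wason's task can be sketched in a few lines of Python. This is a hypothetical illustration, not Wason's procedure: the function name `hidden_rule` and the specific test triples are mine, and I assume the experimenter's rule was "any strictly increasing triple."

```python
# Hypothetical sketch of Wason's rule-discovery task: the hidden rule is
# far more general than the example triple suggests.
def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing triple."""
    a, b, c = triple
    return a < b < c

# A subject whose hypothesis is "increasing even numbers" and who only
# runs positive tests -- triples that already fit that hypothesis.
positive_tests = [(12, 14, 16), (50, 52, 54), (2, 4, 6)]
print([hidden_rule(t) for t in positive_tests])  # every test comes back "yes"

# A disconfirming test -- one the subject's hypothesis predicts should
# fail -- is the only kind that can expose the broader rule.
print(hidden_rule((3, 5, 7)))  # odd numbers also pass: hypothesis falsified
```

The point of the sketch: as long as a subject proposes only confirming sets, every answer is "yes" and the wrong hypothesis survives indefinitely.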
Imagine looking at a projected photographic image so badly focused that identification is impossible. The picture is gradually focused until it is just slightly blurred, at which point you might identify it about 75% of the time. With prior exposure to the more blurred image, however, you can correctly identify it only about 25% of the time. Interpreting this finding, both Wyatt and Campbell (1951) and Bruner and Potter (1964) suggested that subjects' preliminary hypotheses, formed on the basis of early, poor data, interfered with effective interpretation of later, better data.
Under What Conditions Does Theory Obstruct Research Progress?
When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization." The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets." Subjects knew that one basket contained 60% black and 40% red balls; the other, 40% black and 60% red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that favors neither basket. After each ball was drawn, subjects in one group were asked to state out loud their judgment of the probability that the balls were being drawn from one basket or the other. These subjects tended to grow more confident with each successive draw: whether they initially thought the basket with 60% black balls or the one with 60% red balls was the more likely source, their estimate of the probability increased. This is how Karl Rove ended up on live TV arguing, with complete certainty, that there were more Republican than Democratic votes left to be counted in Ohio. We humans selectively search for, and interpret, evidence that confirms our initial assessment... until we can't.
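A quick Bayesian calculation shows why the subjects' growing confidence was irrational. This is my own sketch of the setup described above, not the original experiment's analysis; the function name `posterior_A` and the ten-draw sequence are assumptions for illustration.

```python
# Bayes-rule check of the "bingo basket" setup: basket A holds 60% black /
# 40% red balls; basket B holds 40% black / 60% red.
def posterior_A(draws, prior=0.5):
    """Probability the draws came from basket A, updated draw by draw."""
    p = prior
    for color in draws:
        like_A = 0.6 if color == "black" else 0.4  # P(color | basket A)
        like_B = 0.4 if color == "black" else 0.6  # P(color | basket B)
        p = p * like_A / (p * like_A + (1 - p) * like_B)
    return p

alternating = ["black", "red"] * 5   # ten draws of alternating color
print(posterior_A(alternating))      # ~0.5: the evidence cancels exactly

unbalanced = ["black"] * 3           # three black draws in a row
print(posterior_A(unbalanced))       # well above 0.5: confidence is warranted here
```

Each black draw favors basket A by a factor of 0.6/0.4, and each red draw favors basket B by the same factor, so an alternating sequence leaves a rational observer exactly where they started. The subjects' confidence grew anyway.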
“The point is that we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield.”
—George Orwell, In Front Of Your Nose

Are Scientists Immune To Confirmation Bias?
One might think that if everyone inherits this bias, then no one can trust the scientists who documented it. Scientists are not immune; the history of science is a history of being wrong. However, science as a field has learned that certain conditions reduce the odds of overconfidence. Competition is relied on a great deal, and there is plenty of it: for any theory proposed, every scientist in the field has a motivation to disprove it, modify it, or replace it with a better theory. The public demonstration of replicable, objective evidence also helps prevent wasting time on endless debate or unfalsifiable theories. Under these conditions, the scientist who can best describe the available data has the successful theory. Essentially, no one knows how wrong they are until their competitors demonstrate better results.
Conflict Of Interest
There is an inherent conflict of interest in relying on someone, anyone, even a scientist, to disprove their own theory. In science, grant proposals and publication both depend on the reputation of the author and the proposed theory. And look at Rove: millions of dollars were riding on his ability to influence the election. That's a massive conflict of interest.
Once there is a competition, there must be a method of determining winners and losers. Every field of science has its own methodology. In politics, elections play the major role in determining winners and losers.
“Nate Silver Was Right, And I Was Wrong”
So for the first time in years, we have witnessed an admission of conservative ideological error. We now have conservative pundits on record that ideology does not trump reality: feeling like conservative turnout will be higher does not make conservative turnout higher. Mr. Rove is likely reconsidering the “we create our own reality” line of thinking he seemed to use when challenging the Ohio election returns live on Fox. The election has the qualities necessary to break confirmation bias: fair public competition and objective evidence to determine the winner.
Raising Taxes Will Lower Revenue? Want To Bet?
We can do something about blind obedience to ideology, the bubble. There are opportunities to create similar situations: reality checks. President Obama has recently proposed an increase in the tax rate for the top 2% of the population. Without a doubt, some conservatives will oppose this with a supply-side argument. At that point, anyone confident that they understand economic fundamentals could lure those blinded by ideology into a similar competition with an objective winner and loser. One way is with a wager. One could also build such a reality check into the legislation itself: raise taxes on the top 2% in steps, at six-month intervals, if and only if revenue increases over the previous period. If conservatives truly believe their ideology, they will consider the possibility of losing far more remote than an impartial observer would, and stray into a reality-based contest. The undeniable loss then lets reality intrude on the conservative idealist and break the bubble. Care must be taken by anyone initiating a reality check, though, because it may go either way. You may find that you are the one living in the bubble.
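The stepped-increase trigger described above can be sketched as a simple rule. This is purely illustrative: the function name `next_rate`, the step size, the cap, and every revenue figure are invented for the example, not drawn from any actual proposal.

```python
# Hypothetical sketch of a revenue-triggered tax step: raise the top rate
# one step per period, but only if revenue rose over the previous period.
def next_rate(current_rate, prev_revenue, new_revenue, step=0.01, cap=0.396):
    """Advance the rate one step when revenue grew; otherwise hold steady."""
    if new_revenue > prev_revenue:
        return min(current_rate + step, cap)
    return current_rate

rate = 0.35                        # illustrative starting top rate
revenues = [100, 104, 103, 108]    # made-up revenue per six-month period
for prev, new in zip(revenues, revenues[1:]):
    rate = next_rate(rate, prev, new)
print(round(rate, 2))              # two increases applied, one skipped
```

The design makes the ideological claim self-testing: if raising rates really lowered revenue, the mechanism would halt itself after the first step, and the outcome would be a matter of public record rather than debate.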
Faith in falsehoods is an absolute vulnerability.