A few additional thoughts about issues raised in my two most recent posts about how we automatically process new information to either accept or reject it depending on its relationship to values and beliefs already held.
Cultural cognition refers to the influence of group values — ones relating to equality and authority, individualism and community — on risk perceptions and related beliefs.
People with individualistic values, who prize personal initiative, and those with hierarchical values, who respect authority, tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire. By contrast, people who subscribe to more egalitarian and communitarian values are suspicious of commerce and industry, which they see as sources of unjust disparity. They are thus more inclined to believe that such activities pose unacceptable risks and should be restricted. Such differences, we have found, explain disagreements in environmental-risk perceptions more completely than differences in gender, race, income, education level, political ideology, personality type or any other individual characteristic.
Cultural cognition also causes people to interpret new evidence in a biased way that reinforces their predispositions. As a result, groups with opposing values often become more polarized, not less, when exposed to scientifically sound information.
[Kahan - above]
Similar to the “backfire” effect I’ve discussed elsewhere, [see this and this] the biases and inclinations already part of who we are serve as a powerful gatekeeper when we’re presented with new information on matters of importance to each of us. More often than not, we assess the new data not through conscious deliberation, but through instinctive, knee-jerk reactions that determine for us the value and merit of that information.
It’s often too disconcerting for us to give the time needed to consider the information in an objective light because the initial reaction we have is one of discomfort or unease. We get rattled by suggestions that what we’ve thought and believed all along might be wrong, or is some weird attempt to get us to agree with even weirder propositions. So we too quickly ignore or dismiss what’s been presented to us for a variety of intuitive reasons to help us maintain some emotional and psychological balance in our lives.
Who among us has the time to carefully evaluate reams of new data on matters of potentially great significance when those whom we associate with feel just as we do? As a life-long Boston Bruins hockey fan, I’m not going to give a Montreal Canadiens fan one moment of my time to consider the merits of rooting for that team, its storied history and legendary players notwithstanding. When I hear or read climate change deniers offer their rationales for disputing the volumes of evidence in support, I’m every bit as dismissive of the nonsense offered. Those automatic reactions to presentations contrary to what I’m already certain of put me in the company of just about everyone else on all kinds of issues.
But debating the merits of the Bruins versus Canadiens poses limited risks to me today or long-term. With matters of genuine cultural and political importance, casual and immediate dismissal of new or contrary evidence may not always be as risk-free or inconsequential. The more information offered, the more we owe it to ourselves, our families, and our communities to pause before locating the nearest mental wastebasket.
Creating a better future may not be easy, but relying on easy to get us there carries risks we’d all be wise to consider for an extra moment or two.