Cognitive biases part 1 – when the traitor is you



Betrayal is among the most painful experiences we can endure.  But it happens far more often than most of us would like to believe, and in ways that may be far less obvious to us than they should be.

“Impossible!” you say?   Think again.

A friend recently shared a story about betrayal.    She’s an attractive young woman with a child in a small-town kindergarten.  A few weeks ago, she received a sexually suggestive text message from the father of one of her child’s classmates.   He followed up his flirtatious text with “Oops.  Silly autofill.”

The author of the text is a man my friend knows only in passing, but in their brief encounters she’d felt that he had a “creepy vibe”.   My friend didn’t reply to his messages but instead shared them with the man’s wife.   The wife was initially both outraged and apologetic, telling my friend that her husband has cheated on her in the past,  is addicted to pornography, and that she’s caught him sending inappropriate texts to other women.    She thanked my friend for sharing the text messages with her.

Within a day, the same woman called my friend to tell her that she’d been wrong, that the offensive text was really just an accident.   Her darling husband was the unwitting victim of a predictive text feature.  Silly autofill.

From the perspective of a more or less dispassionate observer, the man is a lying, faithless rake.

His wife is not dispassionate.  Faced with the difficult prospect of holding her wretched lout of a husband accountable, some subconscious process blinded her to his perfidy.  So she gets to keep her twisted status quo and he gets away with yet another attempted dalliance.

Is she crazy?

Nope.  She’s just human.  And it turns out that humans, favored as we are with our large mammalian brains, are afflicted by some serious mental glitches that can lead to some very strange blind spots.

The Human Mind: Very Puzzling

Selfish genes

Most behavioral scientists believe that the human brain has developed over innumerable generations stretching back to our prehuman ancestors.  Scientists call the glitches and blind spots that developed along with our brains cognitive biases.  A cognitive bias can be defined as a systematic error in reasoning that affects decisions and judgments.  Based on the principles of natural selection, brains with cognitive biases have better survival characteristics than brains without them.

On the face of it, it’s hard to see an advantage to traits that hide reality from us or that interfere with our ability to reason correctly.  But it may help to remember that reproduction, not rational thought or an accurate understanding of reality, is both the primary goal and the driving force of natural selection.  It may also help to remember that natural selection isn’t always a very nice process.  Just ask a dinosaur.  Or the wife of a philanderer.

If this makes you uncomfortable, you might consider reading “The Selfish Gene” by Richard Dawkins.  In it, Dr. Dawkins explains how the messy business of biology is best seen from the point of view of individual genes.  Genes are molecules that have stumbled upon very effective, albeit self-serving, ways of preserving their existence through multiple generations.  Dawkins’s perspective helps us see that genes essentially create an organism around themselves that is designed to replicate the gene and pass it to the offspring of the organism that houses it.  Any trait, whether it’s cognitive, behavioral, or morphological, that facilitates the transfer of genes to progeny is favored and preserved.

Shortcuts with consequences

Cognitive biases didn’t develop just to make sure that we produce offspring.  Many of them arise from our tendency to employ heuristics, a sort of mental shortcut that saves time and effort.  There are lots of different types of heuristics.  One example is the representativeness heuristic.  This might lead us to trust someone without learning enough about them to fully justify our trust, simply because they remind us of someone we know well who has proven to be trustworthy.  If you find yourself kindly disposed towards a new acquaintance because they remind you of your grandfather, your brain is employing this shortcut.

Heuristics are a necessity.  If we didn’t employ them, we’d become hopelessly bogged down in mundane decisions like whether to bring a raincoat for our walk in the park.  In general, we employ them successfully.  But sometimes they contribute to cognitive biases.

So here we are, the owners of what may well be the most complex assembly of matter in the universe, stuck with strange glitches that sometimes seem to betray us and at the very least often blind us to reality.  How in the world did we ever manage to create a technologically advanced society?

The answer is that our brains generally work pretty well, which is why we have smartphones, Tesla automobiles, and over 400 satellites in geosynchronous orbits.  Yes, our brains use heuristics, but they’re necessary.  Beyond providing more opportunities for genes to pass themselves along and facilitating faster decision-making, our cognitive biases don’t really do much harm, right?

If only it were so.

Despite our incredible technological progress, human cultures are deeply divided. And many of our interactions with those who don’t share our opinions about politics, religion, and even climate are characterized by rancor, confrontation, and sometimes brutal violence.

Is it possible that at least some of our social and societal dysfunctions arise from our cognitive biases and the blinders they produce?

Conflict and cognitive bias

The role of cognitive bias in social conflict

Behavioral scientists have recognized over 150 different cognitive biases. They’ve given them names like the Dunning-Kruger Effect, the Weber-Fechner Law, and post-purchase rationalization. They can be loosely categorized by four main problems that give rise to them: too much information to process thoroughly, inaccurate memory storage and/or retrieval, the need to think or act quickly, and the need to assign context or meaning to events.

One of the most pernicious biases is called confirmation bias. In simple terms, confirmation bias is our tendency to believe what we want to believe. And it is at the root of a great deal of trouble.

In 2004, Dr. Drew Westen of Emory University in Atlanta led a study into how confirmation bias works in our brains by using functional magnetic resonance imaging (fMRI).  And although scientists warn us about putting too much stock in fMRI studies of the brain, this one is worth considering, in no small part because it offers a very plausible explanation for human behavior.

The Emory researchers found that when a test subject with a strongly held political belief was presented with evidence that showed their belief was not well founded, that information was not processed by the part of the brain associated with reasoning.  Instead, the parts of the brain that process emotion, moral reasoning, and reward and pleasure were accessed sequentially.

The Emory research appears to show that our brains bypass our logic circuits in an effort to retain a cherished belief. When our brains succeed in disregarding information that would challenge us to think more clearly, they reward us with a burst of dopamine.

In other words, we’re wired to reject sound arguments that call our fondest beliefs into question and to feel good about it when we do.

You may want to let that sink in for a minute, or even read it again and follow the links to make sure I’m not mischaracterizing the research.  It’s at least a little stunning if it’s true.

The best defenses

Hopefully, by this point, you can see why cognitive biases in general, and confirmation bias in particular, are really important, and why understanding how they work could change lots of things for the better.

Confirmation bias is a problem for people who are knowledgeable and well-informed.  Imagine how it affects people who are neither.  It’s not a happy thought.

Thankfully, Americans are very well informed about the political and religious ideologies that we defend so aggressively.  Right?

Um, well, no…not really.   Based on polling in 2014 by the Annenberg Public Policy Center at the University of Pennsylvania, only 36% of Americans can name all three branches of government.  And 60% don’t know which party controls the House of Representatives.  Which is probably ok, since most of those don’t know exactly what the House of Representatives is anyway.

Well-informed or not, we tend to be very strongly opinionated and confirmation bias can help us cling to even the silliest notions.

Silver lining

Overcoming cognitive bias

Once we’re aware of them, it’s pretty easy to see confirmation and other biases at work in the people around us.  But thanks to something called the bias blind spot, our brains work hard to ensure that we overlook bias in ourselves.

Fortunately, there are things we can do to overcome both our own biases and those in others.  As usual, it’s best to start with ourselves.

The first step in combatting our cognitive biases is to recognize that they exist.  By reading this article, you’ve already made some progress.  Don’t stop here.  This is a big topic with lots of well-sourced information.  It should be a mandatory part of public school curricula, but as far as I know, it’s not.  So educate yourself.

The second step is to consider all evidence with this question in mind: how would I react to this evidence if it confirmed the opposite of what it seems to confirm?  This technique was tested by a research group working under Dr. Charles Lord at Stanford in 1979 and proved to be remarkably effective in helping partisans to more fairly evaluate information that could sway their opinions.  You can read more here.

The third step is to recognize that you’re more susceptible to cognitive biases when you have a strong emotional attachment to a particular idea or belief.  The wife who refused to see her husband’s philandering for what it was is again a good example.  Strong emotions are often a good indicator that your brain is, or at least could be, fooling you, and an angry reaction to some fact or evidence should be a huge red flag.   Make up your mind that you’re going to examine evidence that contradicts your fondest beliefs, if for no other reason than to better understand those who don’t agree with you.  By committing to this simple measure, you’ll be well on your way to combatting at least a couple of pernicious biases.

The fourth thing you can do, when you suspect that someone you know is under the influence of a cognitive bias, is to resist the urge to tell them how stupid they or their ideas are.  In her book “The Influential Mind”, Tali Sharot points out that it’s nearly impossible to alter someone’s opinion by arguing with them.  The best, and possibly the only, way to change someone’s mind is to simply present them with facts in a noncombative way.  When this is done in a spirit of camaraderie, simply to share information, we tend to be far more receptive than when we’re being attacked as ignorant and gullible rubes.

Final Thoughts

Reality is complex.  It’s likely that we don’t have the capacity to accurately grasp it in its entirety.  That’s disconcerting at the very least, but it needn’t discourage us from trying to understand it as fully as we’re able.

Cognitive biases stand between us and a more accurate understanding of the way things really are.  They interfere with our relationships and color our opinions.  And they pit us against each other in too many ways to count.

There is little doubt that our biases have served the purposes of natural selection.  But as our technologies and cultures have begun to change radically over the course of decades instead of the millennia and even eons that have characterized most of our history, we seem to find ourselves saddled with anachronistic features that no longer serve our interests.

Overcoming built-in distortions in the lenses through which we see the world isn’t easy, but we need to make the effort.  Let’s get started.