5 Amazing Ways We Fool Ourselves

Photo by Steve Harris on Unsplash


April 1st has historically been the most light-hearted day of the year.

If ever we needed a splash of levity in our sea of seriousness, it’s now.

It’s a day of hoaxes, pranks, and practical jokes with the people we love. The best part is that nobody gets offended, or at least they’re not supposed to be. If the recipient responds with cursing or tears, you know things have gone too far.

It’s a chance for self-deprecating humor. It acknowledges that there’s a certain amount of folly that resides in each of us.

Back on April 1st, 1976, the BBC nailed it. British astronomer Patrick Moore announced on BBC Radio 2 that at 9:47 AM, a once-in-a-lifetime astronomical event was going to occur, and listeners could experience it in their very own homes.

Due to a rare alignment of Pluto and Jupiter, the earth’s gravity would be temporarily reduced. Whoever jumped with all their might at just the right moment could possibly float in the air! Listeners who hadn’t noticed the date were jumping up and down, hoping to achieve levitation. Other classics include the flying penguins of King George Island and the great spaghetti harvest.

Special day aside, there are some amazing ways we fool ourselves year-round. Sometimes this comes with tragic consequences. It’s become more evident and rampant over this past year, and that’s what I’m writing about this month.

Nobel prize-winning psychologist Daniel Kahneman states that we have two thinking systems: one “fast,” the other “slow.”

Fast thinking is the area where we can often fool ourselves.

We apply mental shortcuts, “hacks,” or biases when problem-solving or deciding things. It’s the realm of gut instincts, snap judgments, and hardwired systemic flaws in our thinking.

Slow thinking is a more deliberate examination of thoughts and motives.

We need both systems.

Biases are those deeply ingrained codes in our caveman software that can’t be quickly unlearned.

They have a profound impact on the following:

  • Our Perception – how we see people and perceive reality.
  • Our Attitude – how we react toward certain people.
  • Our Beliefs – how we interpret and respond to events.
  • Our Behaviors – how receptive/friendly we are toward certain people.
  • Our Attention – which aspects of a person we pay the most attention to.
  • Our Listening Skills – how much we actively listen to what certain people say.

It’s helpful to think of biases as optical illusions. You know, things that appear to be there but really aren’t. Or like that photo-distortion app that makes for a very unflattering selfie.

Here are just five of the biases I’ve run into recently.

Negativity Bias (Good + Bad = Bad)

We want to think we’re rational, well-adjusted human beings, but our brains are naturally hardwired toward the negative.

Have you ever found yourself overthinking a mistake you made a while ago? Or replaying in your head a conversation that didn’t go so well?

That’s the negativity bias at play: not only do we register negative stimuli more readily, but we also tend to dwell on these events for longer.

A Queen’s University research study estimates the average person has about 6,200 thoughts per day. Other studies indicate that a high percentage (67% to 80%) are negative, and up to 95% are exactly the same repetitive thoughts as the day before.

So… if 80% of our roughly 6,200 daily thoughts are negative, that’s about 5,000 negative thoughts a day, and if 95% of those are repetitive, most of them are reruns of yesterday’s. We have a serious “built-in” flawed-perception problem.

Quite simply, negative events have a more significant impact on our mental state than positive ones.

Kahneman suggests an end-of-day exercise: intentionally reflect on at least three good things that happened that day, to counterbalance our natural tendencies.

While the negativity bias may have been a helpful survival mechanism for our ancestors, today, it has a powerful—and often unconscious—impact on how we behave, think, and make decisions.

Groupthink Bias

Groupthink is a genuine phenomenon that happens when a group of well-meaning people makes bad decisions in order to identify with, or belong to, a particular group.

Another term for this is conformity bias.

In this scenario, any kind of dissent is unwelcome. Any reasoned questioning automatically makes one a social leper.

This bias is often fueled by a particular agenda, plus the fact that group members value harmony and cohesion above critical thinking.

This bias causes people to simply “follow the herd” rather than thinking things through and using their own independent ethical judgment.

History is riddled with tragic examples of groupthink. The mass suicide known as the Jonestown Massacre is just one of them. Hence the dark meme “Don’t drink the Kool-Aid.”

Confirmation Bias

Confirmation bias is the strong tendency of people to seek out information that exclusively supports views they already hold. Evidence and information get interpreted in ways that affirm their pre-existing beliefs, expectations, and hypotheses.

Any contradicting evidence or information that may lead to a different conclusion is ignored.

A humorous illustration of this is the Texas Sharpshooter fallacy, where the cowboy unloads his pistol at the fence and then paints a bullseye around the tightest cluster of bullet holes. He wants to believe he’s a good shot and manufactures proof to support that notion.

A close cousin to this is the belief bias: if I believe something strongly enough, it must be true.

The question to ask: Is this really true? Or do I just want it to be?

This thought pattern can easily lead to conclusions that are inaccurate or even unethical.

Diffusion of Responsibility Bias

Diffusion of responsibility occurs when someone who needs to act waits for someone else to act instead. It becomes a ripple effect: the greater the number of people who are aware or involved, the more likely it is that each person will do nothing, believing someone else in the group will respond.

Psychologists John Darley and Bibb Latané set up a bystander apathy experiment where a distress call made it appear that a person nearby had suffered an injury. When subjects heard the cry and thought they were the only ones who heard it, 85% of them helped. But if subjects thought one other person heard the call too, only 62% helped. And if subjects thought that four other people also heard the cry for help, just 31% took action.

Diffusion of responsibility makes us feel less pressure to act because we believe, correctly or incorrectly, that someone else will do it. When we don’t feel responsible for a situation, we feel less guilty when we do nothing to help.

In this way, diffusion of responsibility keeps us from doing the right thing and makes it easier to ignore our own conscience.

It’s complicated, but there is hope. Here’s a TEDx explainer.

Self-Serving Bias

The self-serving bias is where we seek out information and use it in ways that advance our own self-interest. We often unconsciously make selfish decisions that other people might view as questionable.

It can also take the form of taking credit for positive events or outcomes while blaming outside factors for negative ones.

The irony is that we can easily spot this trait in others, but we have difficulty seeing it in ourselves.

An example might be doctors who believe that they are immune from the influence of gifts they receive from pharmaceutical companies. Studies show those gifts have a significant effect on what medications doctors prescribe. One study found that 64% of doctors believed that the freebies they received from suppliers influenced other doctors. However, only 16% of doctors thought it affected their own actions.

So, the self-serving bias often blinds us to how we are prejudiced in favor of ourselves. Indeed, it can cause even the most well-intentioned of us to completely overlook our own wrong actions.

To summarize, these five biases are just a small sampling. The good news is that when we encounter them, we can switch to “think slow” mode and ask some questions.

Here are some helpful questions that I borrowed from Annie Duke’s book Thinking in Bets:

  • Why might my belief not be true?
  • What other evidence might be out there bearing on my belief?
  • Are there similar areas I can look toward to gauge whether similar beliefs to mine are true?
  • What sources of information could I have missed or minimized on the way to reaching my belief?
  • What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?
  • What other perspectives are there as to why things turned out the way they did?

Hope this helps,

Until next time,

Lorne