Bias Blind Spot

We tend to see ourselves as less biased than other people. The Bias Blind Spot is the tendency to recognize the impact of biases on the judgment of others while failing to see the impact of biases on our own judgment. It's a fascinating paradox that can affect individuals, teams, and organizations, leading to a lack of self-awareness, misunderstandings, and poor judgment.

Bias Blind Spot was first formally identified and named by researchers Emily Pronin, Daniel Lin, and Lee Ross in 2002.

Before this work, psychologists had long known that individuals tend to attribute their own actions to situational factors while attributing others' actions to personal characteristics, a phenomenon known as the fundamental attribution error.

The Bias Blind Spot took this understanding a step further by focusing specifically on cognitive biases. Pronin and her colleagues found that even when people were educated about various cognitive biases, they consistently rated themselves as less prone to these biases than the average person. This held true across a wide range of biases, from self-serving attributions to the halo effect.

Further research has shown that the Bias Blind Spot is remarkably resistant to intervention. People fall for it even when they have a solid understanding of various cognitive biases.

Yikes!


On software teams, the Bias Blind Spot can manifest in subtle yet impactful ways, affecting everything from hiring and promotions to strategy and execution.

Hiring managers might believe they’re objectively evaluating candidates based on qualifications, but unconsciously favor candidates with familiar backgrounds or experiences similar to their own. A manager might rate an employee highly across all categories based on strong performance in one area, without realizing they’re allowing this strength to overshadow potential weaknesses in other areas.

Beyond hiring, performance reviews, and promotions, it can also shape product vision, strategy, and execution. For example, a team leader might insist on pursuing a particular product direction, attributing team members' objections to risk aversion or lack of vision. Meanwhile, they might fail to recognize their own overconfidence bias or sunk cost fallacy influencing the decision.

And when projects ultimately don't go well, team members might be quick to blame their teammates' biases or mistakes, without considering how their own biases contributed to the overall situation.

🎯 Here are some key takeaways:

Acknowledge this is a thing

Try not to dismiss this as something that only affects others. Recognize that it's a universal human tendency, and you're just as susceptible to it as anyone else.

You're no better than anyone else

Avoid the trap of thinking you're the exception to the rule. Instead, approach situations with humility, acknowledging that your perceptions and judgments may be just as skewed as those of others.

Challenge your initial assumptions

Make it a habit to question your first impressions and instinctive reactions. Ask yourself why you hold certain beliefs or prefer certain ideas.

You won't be able to eliminate it

It's deeply ingrained in us. However, this doesn't mean you should give up. Focus on developing strategies to minimize its impact and create systems that help compensate for it in your team's processes.

Use debiasing techniques

Promote the use of structured decision-making tools, diverse review panels, and anonymous feedback systems within your team. These techniques can help counteract individual biases and lead to more objective outcomes.
