‘All our customers will love the new feature’
‘This is how it’s done here’
‘It has been happening this way since forever’

These are common examples of cognitive biases at work.

Our brains take in a lot of information every day. To filter it, they rely on mental shortcuts to process information quickly and make decisions. We observe patterns and make decisions on the basis of past experience and judgement, and this is the root cause of biases in our thinking.

If we don’t keep these shortcuts in check, they turn into cognitive biases that produce a series of ineffective decisions, and eventually unexpected failure in the project or product you’re working on.

As product managers, we constantly make decisions that determine the success or failure of our product. So how do you make the right decisions?

By making rational decisions.

It’s not always that simple. Our brains are fogged by past experiences, emotions and beliefs. We don’t think rationally and don’t judge the situation as critically as needed, especially when the stakes are high and we’re under pressure.


Bhavya Mihira, Senior Product Manager at Nike Amsterdam, hosted a meetup with us where she talked about overcoming cognitive biases in product management.

She shared a case study of Space Shuttle Columbia’s last mission to explain the cognitive biases that led to the disaster. These cognitive biases are commonly seen in teams where the right systems of decision making are not followed.

In this blog we will look at 3 cognitive biases commonly found in product managers through the example of Columbia’s last mission, and at how to overcome these biases while making decisions.

Columbia’s Last Mission in short

  • Space Shuttle Columbia had 27 successful missions over 22 years prior to the tragedy
  • January 16th 2003: Space Shuttle Columbia lifted off on a 16-day research mission carrying 7 astronauts: Rick Husband, commander; Michael Anderson, payload commander; David Brown, mission specialist; Kalpana Chawla, mission specialist; Laurel Clark, mission specialist; William McCool, pilot; and Ilan Ramon
  • February 1st 2003: The shuttle broke apart on re-entry to Earth, killing all 7 astronauts on board
  • The US space shuttle program suffered a severe setback after this tragedy

The Foam Problem

  • During launch, a piece of insulating foam broke off the external tank and struck the left wing of the orbiter (the vehicle housing the astronauts), causing severe damage
  • This was the largest piece of foam ever known to have struck the orbiter
  • NASA knew all of the above before letting the shuttle return to Earth
  • The foam damage caused the shuttle to burn up on re-entry

One person or one decision cannot be blamed for this disaster. This tragedy was a result of multiple wrong judgements and decisions by a group of people.

We also can’t blame one cognitive bias for the failure of a product. In hindsight, it’s always a cocktail of cognitive biases that causes the failure.

This is called the Lollapalooza effect.

It is a concoction of multiple cognitive biases acting together that can lead to an extreme outcome.

In Columbia’s last mission, multiple biases combined to cause the tragedy. Other factors also contribute to the failure of a project, but they all come down to wrong decisions made by the team or its leaders.

The three biases we will focus on in this blog:

  1. Confirmation Bias
  2. Normalisation of Deviance
  3. Outcome Bias

Confirmation Bias - and how it affects product management

Confirmation bias is the tendency to cherry-pick information that confirms your existing beliefs. We tend to ignore information that does not confirm our preconceptions.

A classic example of confirmation bias: if you google the cause of a minor headache suspecting it could be a tumour, you are more likely to click on search results that confirm that fear than on links that suggest otherwise.

(Source: Farnam Street)


In a team environment, when we’re working on products and research, we end up asking leading questions in the hope of getting favourable answers. And we focus only on the metrics that confirm our hypotheses and beliefs, making the findings inherently flawed.

In the case of the Columbia space shuttle tragedy, shuttle managers sought information exclusively from experts who had downplayed the risk of foam strikes in the past.

During the investigation, email conversations from Linda Ham (the mission manager) showed she asked leading questions like:

“Can we say that for any (external tank) foam lost, no ‘safety of flight’ damage can occur to the Orbiter because of the density?”

In the context of product management, confirmation bias shows up during user research, where interviewers tend to ignore the pain points of users, especially those that don’t match the interviewer’s beliefs. Interviewers ask leading questions like: ‘This app feature is useful, right?’

Product managers conclude an A/B test before it has reached statistical significance because early results confirm their hypothesis. They focus on the metrics that suggest the beta test has been a success, and recommend a full market launch.
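
One simple antidote is to actually compute significance before calling the test. Below is a minimal sketch in Python (not from the talk; the conversion numbers and threshold are made-up illustration values) that uses a two-proportion z-test to check whether an apparent ‘win’ for a variant is real or just early noise:

```python
# Minimal illustration: check whether an A/B test difference is statistically
# significant before declaring a winner. All numbers below are made up.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-statistic and two-sided p-value for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value via the normal CDF
    return z, p_value

# Hypothetical early results that "look" like a win for variant B
z, p = two_proportion_z_test(conv_a=48, n_a=1000, conv_b=62, n_b=1000)
print(f"z = {z:.2f}, p-value = {p:.3f}")
if p < 0.05:
    print("Statistically significant at the 5% level.")
else:
    print("Not significant yet: keep the test running instead of calling it early.")
```

On these illustrative numbers the p-value comes out around 0.17: the difference looks promising but is not yet significant, which is exactly the kind of result a confirmation-biased reading would prematurely declare a win.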

Normalisation of Deviance

“Social normalisation of deviance means that people within the organisation become so much accustomed to a deviation that they don’t consider it as deviant, despite the fact that they far exceed their own rules for the elementary safety” —Diane Vaughan, 1996.

Simply put, it is the gradual acceptance of deviant practices and their outcomes as normal. For example, if a development team keeps ignoring coding guidelines and no release is ever affected by the non-compliance, ignoring the guidelines becomes the norm.

Diane Vaughan developed this theory when she was investigating NASA’s space shuttle Challenger accident in 1986.

She observed a pattern of repeated warnings that officials at NASA chose to ignore, despite the known design flaw in the ‘O-ring’ of the rocket booster.

Vaughan observed: ‘the fact that no consequence resulted from the inaction led to deviance becoming normalised within the NASA culture.’

That tragedy, too, could not be blamed on one person or one decision. It was a flaw that was ‘socially organised and systematically produced by social structures’.

The deviation spiral explains how small actions compound into much bigger consequences.


When teams continue to ignore small flaws, over time those flaws compound, resulting in a failed product launch.

Outcome Bias



“We will try this new feature because our competitors saw good results with a similar feature too.”

Rather than critically analysing the process used to arrive at a decision, we use the outcome to judge the quality of the decision. We think that if a process made us successful last time, the same process will make us successful next time too.

‘We tend to make decisions based on past results, rather than judging the process behind those results.’

Previous instances of foam damage had never caused serious problems in US space shuttle history. The Columbia team assumed this time would be no different and let the shuttle return in spite of the known foam damage.

In product management, when a manager makes decisions based on ‘gut instinct’, it’s possible they’re going in the opposite direction to what their team is trying to convey and what the analytical insights show. If the outcome of the decision is positive, the team may start deferring to the manager’s gut feeling, and the entire project becomes a gamble on ‘gut feelings’ and ‘instincts’.

If the decision results in a poor outcome, on the other hand, the manager and team will be more inclined to review their decision-making process and to favour data and team discussions.

How do you stop making wrong decisions based on these biases?


Biases are natural. Not all biases result in poor decision making. We develop mental shortcuts to make our thinking process faster by categorising events, people and problems.

Wrong decisions are made because of repeated, systematic errors in thinking. Poor company culture and inefficient processes let these biases persist.

It’s important to recognise biases.


Here are 3 simple ways to avoid some of the biases we highlighted above:  


1. Create psychological safety in the team

  • Create a safe space where team-mates feel that questioning the status quo and thinking differently is accepted.
  • They should feel confident when speaking up about what they perceive to be wrong.
  • When team members openly share opinions, the team can take diverse views into account, clarify them and, as a result, make better decisions.
  • Teams think more critically when there is psychological safety, and make more rational decisions overall. They will also tend to raise serious issues that would otherwise have stayed hidden.

2. Play devil’s advocate during team discussions

  • Create a culture where opposing views are a crucial part of decision making.
  • Get an outside-in perspective. Get a person from another domain or team to view the problem or project with a fresh set of eyes.
  • When preparing for a product launch, think about why your launch might fail. List all the ways it could fail.
  • Retrospect as thoroughly on a successful product release as you would on a failed one.
  • Look for data insights that contradict your beliefs.


3. Focus on the process

  • Focus on each step of the process. Don’t only focus on the why and the what, but also on how the work is done.
  • Critically analyse and optimise each step of the process, actively looking for potential loopholes.
  • The right processes, followed properly, will give a reasonably good outcome.

Resources to learn more about cognitive biases:


Always be curious! And keep asking questions.

Case Studies:
Columbia space shuttle disaster
Lollapalooza Effect
Bay of Pigs
How a writer with a Ph.D. in psychology became a poker champion

Books:
The Art of Thinking Clearly, Rolf Dobelli
Thinking, Fast and Slow, Daniel Kahneman
Predictably Irrational, Dan Ariely

If you liked this blog, you might also like Experimentation to make data-driven decisions as Product Managers.


ACKNOWLEDGMENT

Thank you Bhavya for sharing your knowledge with the HelloMeets community and helping me perfect this blog post.