If you have studied economics, then you probably came across the core assumption that people are rational actors who make decisions in their own best interest after careful analysis. This assumption always bothered me, since in my experience that’s not how people behave in real life.

It turns out I wasn’t the only one who had these doubts. Later I came across a field called behavioral economics. Unlike classical economics, which works with the rational-actor model, behavioral economics assumes that people are not rational actors and instead fall under the sway of what it calls cognitive biases.

One of the fathers of behavioral economics is the psychologist Daniel Kahneman, who even won a Nobel Prize in Economics for his work. Together with Amos Tversky, he studied how people reason and why they so often make mistakes in their thinking.

They came to the conclusion that there are two basic ways your brain goes about thinking and reaching decisions: System 1 and System 2.

System 1 is quick and heavily dependent on emotions and the subconscious, while System 2 is slow, logical, and conscious. System 1 is the one humans use most of the time and is basically akin to instinct. It evolved millions of years ago so that your ancestors could make quick decisions in life-and-death situations.

Luckily, in today’s world you very rarely face such life-and-death situations. However, you still tend to rely on System 1 thinking even in cases where a more rational approach would make more sense, and a lot of the time you don’t even realize it.

This often results in sub-optimal decisions, which can be a huge problem in many areas of life, including business. One article in McKinsey Quarterly describes how widespread the problem is:

Our candid conversations with senior executives behind closed doors reveal a similar unease with the quality of decision making and confirm the significant body of research indicating that cognitive biases affect the most important strategic decisions made by the smartest managers in the best companies. Mergers routinely fail to deliver the expected synergies. Strategic plans often ignore competitive responses. And large investment projects are over budget and over time—over and over again.

How should you minimize these types of failures? If you want to make a better decision, it often makes sense to take a step back and engage System 2.

How do you do this? Together with Dan Lovallo (one of the co-authors of the McKinsey article cited above), Daniel Kahneman came up with a 12-point checklist that you can run through before you make any significant business decision.

Here are the 12 points you should look at before you pull the trigger and commit to any type of decision:

1) Check for self-centered biases:

Is there any reason to suspect the team making the recommendation of errors motivated by self-interest? The way to check for this is to review the proposal carefully, paying special attention to whether it is overly optimistic.

2) Check for the Affect Heuristic:

Has the team fallen in love with its proposal? The affect heuristic kicks in when you make a gut decision based on a “feeling” driven by an emotion such as love, fear, or surprise. To avoid this, apply the quality controls on this checklist in a systematic and rigorous way.

3) Check for Groupthink:

Were there dissenting opinions within the team? Were they explored adequately? Groupthink is a psychological phenomenon where herd mentality takes over and dissenting opinions get trampled. This can be very dangerous, which is why you need to take extra care to surface different sides of the issue and solicit other opinions.

4) Check for Saliency Bias:

Could the diagnosis be overly influenced by an analogy to a memorable success? Analogies are sometimes apt and sometimes only superficial, so take extra care when using them. Don’t rely on just one; come up with several analogies and analyze how similar (and different) each one is to the current situation. If the analogies don’t fit, you might have to resort to first-principles thinking.

5) Check for Confirmation Bias:

Are credible alternatives included along with the recommendation? Sometimes people have blinders on and only see the evidence they want to see, the evidence that confirms their preconceived notions, while discarding any information that points to the contrary. To avoid this, always seek out opinions other than your own and request additional options.

6) Check for Availability Bias:

If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now? Availability bias means you rely on the examples that come immediately to mind and overlook other factors that might be at play. It often makes sense to draw up a standard checklist of the data you need in order to make this kind of decision; with such a checklist in hand, it is much harder to forget a crucial piece of information.

7) Check for Anchoring Bias:

Do you know where the numbers came from? Could there be unsubstantiated numbers, extrapolation from history, or a motivation to use a certain anchor? The anchoring effect is very powerful and can easily sway people’s perceptions, which is why it is often used in negotiations. To minimize its effect, reanchor the analysis with figures generated by other models or benchmarks, and request a new analysis.

8) Check for Halo Effect:

Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another? The halo effect often goes hand in hand with survivorship bias and ignores the other factors at play. To counter it, try to list the different factors that could be driving the success and look for additional examples.

9) Check for Sunk-Cost Fallacy, Endowment Effect:

Are the recommenders overly attached to a history of past decisions? To combat this, start with a fresh perspective, ask questions, and examine the situation from different angles. What decision would you make if you disregarded the issues arising from past decisions?

10) Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect:

Is the base case overly optimistic? Sometimes you want something to happen, so you become overconfident and think that the chances of it happening are close to 100%. You also underestimate the challenges in front of you; everything seems easy. Instead, try to take a different view: focus on the challenges and re-examine your assumptions. Have the team build a case taking an outside view, and use war games.

11) Check for Disaster Neglect:

Is the worst case bad enough? Sometimes people are not pessimistic enough in their worst-case scenario. What you need to do instead is plan as if the worst case has already happened. This strategy is called the premortem: you assume the worst-case scenario has occurred and develop a story about its causes.

12) Check for Loss Aversion:

Is the recommending team overly cautious? Most people have an aversion to risk, so they tend to avoid risky options even when those options have a huge upside. To combat this, realign incentives to share responsibility for the risk, or remove the risk altogether.

Read More:
How To Be A Critical Thinker And Develop Your Mental Powers

Read More:
How to think like Elon Musk: First principles thinking
