You have probably argued with someone whose position did not make sense. Instead of backing it up with facts, this person ended up just spouting the same slogans over and over again in a never-ending circle.

We were always at war with Eurasia.

It’s a matter of facts, you would say to yourself. Maybe if they knew the facts, they wouldn’t be saying what they are saying.

After this quiet self-talk, you would then go back, research the shit out of the subject and send a summary of this research to them.

However, contrary to what you expected, they would not buckle under the barrage of facts. Being exposed to facts just made their weakly argued conviction even stronger. Your strategy backfired.

A few days later they would come back and start saying the same thing you told them, albeit with their own spin, without even acknowledging that it was you who told them this in the first place.

We were always at war with Eastasia.

Or they might come back and start shouting their original slogans even louder, thinking it is how loud you shout and not the strength of your arguments that determines who is right.

On the other hand, it is not always others who are the problem. Sometimes you just have to look in the mirror. Maybe it is you who behaves this way. 🙂

What is at work here are cognitive biases. We all fall for them, but some people fall for them harder than others.

The first step of a recovering addict is to acknowledge that you have a problem. 🙂

Only after you are honest with yourself and face your weaknesses can the journey towards recovery begin. Without this step, any attempt at a cure will fail.

Once you have faced up to your problem, the next step is to start learning about what it actually is.

What are cognitive biases and why do humans fall for them?

Humans have to get through many different challenges daily, so different mental mechanisms evolved to make this easier.

One way to solve many of these problems is to use heuristics. These are mental shortcuts that humans take in order to solve problems, and they then act based on these solutions.

Most of the time, the solutions that these heuristics come up with are correct and you can rely on them.

However, there are times when these heuristics fail and come up with a bad answer: a cognitive bias.

My Framework for Cognitive Biases

I have been reading about cognitive biases for a while now. Ever since Daniel Kahneman’s book “Thinking, Fast and Slow” came out, this concept seems to be popping up everywhere, and many more popular works discussing it have appeared.

However, being a perfectionist and a guy who likes to put things in boxes, I was missing a more systematic categorization of the different types of biases out there. For me, this kind of categorization keeps the discussion relevant for the common folk and is extremely helpful if you want to apply the lessons in real life.

Not finding anything that satisfied me, I decided to come up with my own framework for cognitive biases. I tried to reduce all the different biases to their first principles and work up from there.

I looked at some of the basic similarities and differences between the different types of biases and came up with some initial categories.

Why and how did these biases evolve in the first place? Here, evolutionary psychology can shed some light.

Evolution does not come up with perfect solutions, only with solutions that are viable enough to survive.

The drivers for every living thing are survival and reproduction. This happens in a very complex outside world where many dangers (but also opportunities) are present.

So heuristics (and cognitive biases) evolved in order to promote survival. Because your environment is so complex, you need the ability to analyze it and then decide what to do next.

The two basic principles behind the way your brain works are speed and efficiency:

1) You need to make quick decisions based on outside stimuli. So speed is important.

2) You should not expend too many resources, so you need to do things in the most efficient manner possible. You never know when or where your next dinner will come from, so saving energy is a priority.

In order to make decisions quickly and efficiently, your brain developed mental shortcuts. That’s where heuristics come from.

However, if your brain developed to make decisions to ensure your survival, why does it fall for cognitive biases? The answer here is costs.

By costs, I mean the potential pay-off of making the right or wrong decision.

Imagine yourself walking along a path with bushes all around you. You hear a sound. It could be anything really.

However, your memory kicks in, and the sound reminds you of the one a lion makes. You decide to run away from the place as fast as possible.

Turns out it was a false alarm. Your brain connected the dots, but in fact it was a false pattern.

No harm done. You are a bit sweaty and tired, but you are still alive.

Now imagine yourself walking again along the same path. You hear a sound.

This time you decide not to run and just stay there. Then suddenly a lion jumps out of the bushes and kills you. You are dead meat!

You failed to connect the dots and ended up as lunch.

You see what I mean by “costs”? The cost of making a false pattern meant that you just got a bit tired and sweaty. The cost of not making a pattern when in fact there was a pattern meant you ended up dead.

When it comes to heuristics, the cost of seeing a false pattern is much lower than the cost of missing a real one. Sure, you might be wrong, but at least you are still alive.

Your brain is wired to err on the side of caution.
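This cost asymmetry can be sketched with some back-of-the-envelope expected-cost arithmetic. The numbers below are invented purely for illustration; only the asymmetry matters:

```python
# Hypothetical costs -- the numbers are made up for illustration.
COST_FALSE_ALARM = 1    # run from a non-existent lion: tired and sweaty
COST_MISS = 1000        # ignore a real lion: you are dead meat
P_LION = 0.01           # chance that a rustle is actually a lion

# Strategy 1: always run at a rustle (bias towards seeing patterns).
# You only ever pay the false-alarm cost, on the 99% of harmless rustles.
expected_cost_run = (1 - P_LION) * COST_FALSE_ALARM

# Strategy 2: never run (no pattern-matching at all).
# You pay the full price on the 1% of rustles that are real lions.
expected_cost_stay = P_LION * COST_MISS

print(expected_cost_run)   # 0.99
print(expected_cost_stay)  # 10.0
```

Even with a lion behind only one rustle in a hundred, the jumpy strategy is an order of magnitude cheaper on average, which is why evolution favors pattern-matching that over-fires.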

This is where cognitive biases come from. Your brain evaluates thousands of stimuli from the outside every second, and sometimes it makes mistakes.

However, that mistake doesn’t matter if it doesn’t kill you. What matters is that this specific way of solving problems and making decisions keeps you alive the rest of the time.

The different types of cognitive biases

I have broken down all these different cognitive biases into two basic categories arising from two fundamental ideas:

1) The world is centered around me.

2) I need to make the correct decision based on the information available.

The first category is based on the idea that people are selfish. If you read “The Selfish Gene” by Richard Dawkins you know what I am talking about.

Your genes want to survive and reproduce themselves, and that’s why you do too.

There are two basic divisions within this category: how you behave inside your group (in-group) and how you behave as a member of your group towards other groups (out-group).

You want yourself to survive, but you want your family members to survive too (since they share the same genes).

In the first sub-category you are behaving in a status-seeking way, while in the second you promote the survival of your group.

There are different types of strategies that will promote your survival or rise in status. These strategies are either risk-seeking and stroke your ego, or risk-averse and keep your ego in check.

Category 1: The world is centered around me.

Sub-category 1: I have an ego

This category is about how you behave inside your social group.

Biases that boost your ego and drive up your preference for rising up in status:

Let’s go back to the example I gave at the beginning of this post. The guy got exposed to facts, but instead of changing his mind, he kept on believing his BS even more strongly. This is called the backfire effect.

If he had conceded that he was wrong, that would have lowered his self-esteem and he would have fallen in status. Instead, the backfire effect worked in a way to stroke his ego.

Here in this category you have biases like: confirmation bias, backfire effect, Dunning-Kruger effect, IKEA effect, illusion of control, overconfidence effect, reactance, Semmelweis reflex, sexual overperception bias, social comparison bias, and many others.

Notice how they all stroke your ego and try to boost your own social status (at least in your mind) in comparison to others.

Biases that encourage you to keep the status quo or lower your ego:

However, this is not always the best strategy. The risk here is high, and it could get you killed. If you feel overconfident and go into a battle you can’t win, you are very likely to die.

So there are some cognitive biases that instead try to keep the status quo and lower your ego. This is the low-risk strategy. Sure, you might not get extra benefits with this strategy, but at least you keep yourself alive.

Here in this category you have biases like status quo bias, the endowment effect, and loss aversion, which try to preserve the status quo and shield you from any potential losses, while biases like pessimism bias or impostor syndrome work even more directly on your ego by lowering it.

Sub-category 2: I am a social animal

This sub-category is about how you behave as a member of your group towards other groups. You are a social animal. You need a group to survive.

Biases that boost your connection to your group

A human usually cannot survive alone, but needs a group in order to do so. Several biases arose because of this. They boost your connection to a specific group, but also make you conform to it.

These include: ingroup bias, bandwagon effect, herd behavior, groupthink.

Biases that boost your connection to leaders

Human groups are usually hierarchical, with dominant individuals being the leaders. One of the goals of a leader is to have their group survive. In order to do that, they need their subordinates to do what they tell them. There is survival value in obeying the orders of leaders, so some cognitive biases probably arose in order to strengthen these bonds.

These include: authority bias, halo effect.

Biases towards other groups

Oftentimes, survival was not a pretty thing, and your group had to battle other groups for territory and resources. When you saw another group, the default setting was “danger” until proven friendly. This is also linked to the lack-of-information problem that I describe further down.

The out-group biases include: stereotyping, not invented here effect, group attribution error, ultimate attribution error.

Category 2: I need to make the correct decision based on the information available.

Any decision you take is based on some sort of information. This information has to come from somewhere: either it is present in the environment, or it isn’t and you need to get it from somewhere else.

The brain uses different types of inputs as information to create patterns, and then extracts meaning from these patterns. These processes are the source of many cognitive biases.

Sub-category 1: There is information available.

A lot of times there is information present all around you, but there are two headaches associated with this: either there is too much of it, or there is too little of it.

Headache 1: There is too much information present in the environment.

Every second of every day, you are bombarded with information. You are walking to work, and all around you there are buildings, signs, people, discarded cigarettes, sidewalks, clouds, flowers, uncovered potholes, different types of noises and all kinds of other things.

Most of these things really don’t matter that much to you. However, in the midst of all this “information noise”, there could be some pieces of crucial information that you need. The key is to find them.

Your brain is wired to block out all this “noise” and instead focus on just a few things. Unfortunately, this tendency can also lead to different types of cognitive biases.

Here you can include these types of biases: attentional bias, curse of knowledge, focusing effect, Forer effect, identifiable victim effect, pareidolia, selective perception.

Notice how all the biases listed above have to do with picking a single thread out of a huge variety of information.

Many (but not all) of the cognitive biases to do with statistics and base rates fall under this category as well, including: base rate fallacy, conjunction fallacy, insensitivity to sample size, money illusion, neglect of probability.

Headache 2: There is too little information present in the environment.

Most of the time, you are forced to make a decision based on incomplete information. When you don’t have all the required information to be 100% sure, your brain gets to work.

It starts making connections between the information that you do have and tries to fill in the gaps.

The key role of your brain is to find meaning, and when information is missing, it works by creating patterns, using analogies, and filling in the gaps with something plausible.

As we all know, what the brain fills in is frequently not real. This results in different types of cognitive biases.

Among these are: ambiguity effect, anchoring effect, availability heuristic, bandwagon effect, framing effect, functional fixedness, fundamental attribution error, gambler’s fallacy, hyperbolic discounting, illusory correlation, information bias, normalcy bias, planning fallacy, projection bias, pseudo-certainty effect, rhyme as reason, survivorship bias, triviality.

Sub-category 2: There is no information available.

Oftentimes, there will be no information available when you are trying to make a decision. You need to get it from somewhere else, such as your memory.

Headache 3: You need to get things out of your memory.

Your brain knows that information is crucial to your survival, so it tries to store all kinds of information that you can access and use when the need arises.

However, this process is not perfect (due to all kinds of constraints) and so results in many kinds of cognitive biases.

Among these: change bias, childhood amnesia, choice-supportive bias, context effect, cryptomnesia, generation effect, illusion-of-truth effect, lag or spacing effect, leveling and sharpening, levels-of-processing effect, memory inhibition, modality effect, persistence, picture superiority effect, primacy effect, suggestibility, telescoping effect, testing effect, Von Restorff effect, Zeigarnik effect.

Notice that some of these memory biases are also linked to some of the other categories I listed before. For example, choice-supportive bias is not only a memory bias, but also a bias stemming from your ego, and so belongs in Category 1 as well.

That’s a common problem with these types of lists and categories: some of the things listed don’t fit neatly into one category, but instead have the features of two or more.

However, these few basic categories do capture the essence of most cognitive biases and give a better idea of what they are. As long as they are helpful, I keep them.

To summarize the categories:

1) The world is centered around me.

Sub-category 1: I have an ego.

Biases that boost your ego and drive up your preference for rising up in status.

Biases that encourage you to keep the status quo or lower your ego.

Sub-category 2: I am a social animal.

Biases that boost your connection to your group.

Biases that boost your connection to leaders.

Biases towards other groups.

2) I need to make the correct decision based on the information available.

Sub-category 1: There is information available.

Headache 1: There is too much information present in the environment.

Headache 2: There is too little information present in the environment.

Sub-category 2: There is no information available.

Headache 3: You need to get things out of your memory.

Basically, what you need to remember are the two big basic concepts: that you have an ego, and that you need information to make decisions quickly. These are the basis of the vast majority of cognitive biases out there.

If you want a simple way to remember the basic types of biases, then just keep in mind these simple phrases:

I am a douchebag.

That’s for all the biases which are based on ego stroking.

I am a loser.

That’s for all the biases that keep your ego down.

My group is the best and all the other groups suck and are out to get us.

That’s for all the social animal biases.

Sometimes there is too much info, sometimes too little, and sometimes none and I have to use my faulty memory.

That’s for all the information-based biases.

Keep in mind that the categories are not always clear-cut. Most of these biases could fall into several categories. Confirmation bias helps to boost your ego, but it also helps with decision making and with picking one piece of information out of a myriad of competing ones. Herd behavior boosts your connection to your group, but in many cases can also help you make a decision when information is not available.
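Because the categories overlap, the framework is best thought of as a lookup table where each bias maps to a set of categories rather than exactly one. Here is a minimal sketch of that idea; the handful of assignments below follow the examples in this post, and the label strings are my own shorthand:

```python
# Shorthand labels for the framework's categories (my own naming).
EGO_UP = "ego: status-seeking"
EGO_DOWN = "ego: status quo / lowering"
GROUP = "social animal"
INFO = "information-based"

# A few biases mapped to categories; a bias can belong to several.
BIAS_CATEGORIES = {
    "confirmation bias": {EGO_UP, INFO},   # strokes the ego, also filters info
    "backfire effect": {EGO_UP},
    "status quo bias": {EGO_DOWN},
    "impostor syndrome": {EGO_DOWN},
    "herd behavior": {GROUP, INFO},        # group bonding, also a decision shortcut
    "availability heuristic": {INFO},
}

def biases_in(category):
    """List every bias tagged with the given category, alphabetically."""
    return sorted(b for b, cats in BIAS_CATEGORIES.items() if category in cats)

print(biases_in(INFO))
# ['availability heuristic', 'confirmation bias', 'herd behavior']
```

Using sets of labels instead of a single label is exactly what lets a bias like confirmation bias live in both the ego category and the information category at once.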

All this stuff gives you the basics on heuristics and cognitive biases and what they are.

However, if you want to get more details on the categories above, as well as my reasoning, you should read the two original articles on my Cognitive Biases Framework (warning: long and dense read):

Cognitive Biases Framework Part 1: Ego-Based Biases

Cognitive Biases Framework Part 2: Information-Based Biases

Read More:
How to be a critical thinker

This checklist will help you make better decisions and avoid cognitive biases

Beware of Advice: what can you really learn from successful people?


Note: In case you didn’t get the “war with Eurasia and Eastasia” references, they come from George Orwell’s novel “1984”. Read the book to get the context. 🙂
