The internet has changed our lives in many ways, sometimes for the better, and sometimes for the worse. What the events of last year, such as Brexit and the US Presidential campaign, have exposed is the way that the structural flaws of the human brain can be taken advantage of using the internet.

What is encouraging is that these problems have gotten wider play and people are starting to take measures to combat them. Luckily, we can already build upon a wide range of research and solutions in this area, many of them developed because of similar large-scale failures of human rationality.

One of the most basic principles of traditional economics is the assumption that humans are rational actors always striving to maximize their own benefits.

I remember sitting in Econ 101 class and sort of scratching my head at this. In my experience, most humans were very far from rational actors. Actually, I have seen people shoot themselves in the foot more times than I can count.

Then the economic crisis of 2008 arrived and all these theories came crashing down. Most economists realized that humans are not so rational after all, and behavioral economics suddenly exploded onto the scene.

Behavioral economics was nothing new, however. It had been around for a while, but it had never been the dominant paradigm for most economists.

Already in the 1950s, Herbert A. Simon, an economist and psychologist (a polymath really), proposed the idea of “bounded rationality”. In this model, humans are only partially rational, and this rationality is limited by difficulties in formulating complex problems and in processing different types of information.

Many of the basic concepts in behavioral economics are based on the work of two Israeli psychologists, Amos Tversky and Daniel Kahneman. They noticed that there are specific patterns inherent in human decision-making.

The two big terms that they came up with are “heuristics” and “cognitive biases”. Heuristics are mental shortcuts that humans take in order to solve problems and then take action based on these solutions.

Most of the time, these mental shortcuts come up with a good solution. They are fast, efficient and effective. This then allows you to go about your daily life and navigate the world.

However, things are not perfect and sometimes these mental shortcuts come up with the wrong solution. This is called a cognitive bias. The process was fast and efficient, but obviously not effective.

Why do you behave in this way? All of this can be explained using evolutionary psychology.

Humans, just like any organism, are the product of millions of years of evolution. Their bodies and minds adapted due to pressures coming from their environment.

Not in a direct way, but indirectly: due to random mutations, certain behaviors would arise. If these behaviors allowed the organism in question to survive long enough to reproduce, they would propagate to its descendants.

This is the basis of evolutionary theory. An individual doesn’t have to be perfect. It just needs to be good enough to survive.

There are two main goals hardwired deep down into your brain: survival and reproduction. These two things are what drives your existence.

Imagine the everyday environment of your ancient ancestors, living somewhere on the vast savannas.

There are many dangers present. At any time of the day or night, a saber-toothed tiger could jump out of the bush and try to eat you, or a little snake could give you a venomous bite.

In order to survive in such an environment, you need to be able to take in vast amounts of information through your senses, determine what is significant and what isn’t, and then make quick decisions based on this.

Is that noise you are hearing just wind beating against the sand or something more sinister? Is that shadow just a figment of your imagination or a lion heading your way?

These were the main things that your brain developed to analyze. However, there are some constraints on this entire process.

The two basic principles behind the way your brain works are speed and efficiency:

1) You need to make quick decisions based on outside stimuli. So speed is important.

2) You should not expend too many resources, so you need to do things in the most efficient manner possible. You never know when or where your next dinner will come from, so saving energy is a priority.

In order to make decisions quickly and efficiently, your brain developed mental shortcuts. That’s where heuristics come from.

However, if your brain developed to make decisions to ensure your survival, why does it fall for cognitive biases? The answer here is costs.

By costs, I mean the potential payoff of making the right decision versus the wrong one.

Let’s go back to the prehistoric savanna in order to illustrate. You are walking on your way home from a successful hunt, your kill strapped to your back. You are walking alone, since you got held up and your companions went on ahead of you.

Then suddenly you hear a noise coming from behind a rock nearby. Quickly, your brain goes into overdrive. What could it be?

If your brain judges it to be nothing of concern and you pay no attention to the noise, but a lion then jumps on you, you are dead meat. You have become a source of protein and essential fats (and maybe some carbohydrates too) for an entire lion family.

Hopefully, you die quickly and don’t get to experience the joys of being dismembered by a pride of lions: the alpha male and his harem of females, together with their cubs.

However, if you judge it to be a danger, take out your spear and run from the place, but then realize that it was nothing, there is no mortal cost to you. Sure, you got a bit sweaty, but you are still alive to hunt another day.

You see what I am getting at? It is much more costly to judge something insignificant when it is in fact a real threat than to judge something a threat when there is nothing there.

In the first instance, you die, while in the second one, you continue on living and potentially pass on your genes. Your brain is wired to err on the side of caution.
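To make this cost asymmetry concrete, here is a minimal sketch in Python. The probabilities and costs are purely illustrative assumptions I made up for the example, not empirical values; the point is only the shape of the comparison.

```python
# A toy payoff model of the "noise behind the rock" decision.
# All numbers are illustrative assumptions, not empirical values.

P_LION = 0.01          # chance the noise really is a lion
COST_EATEN = 1000.0    # catastrophic cost of ignoring a real lion
COST_FALSE_ALARM = 1.0 # small cost of fleeing from nothing (sweat, wasted time)

def expected_cost(policy: str) -> float:
    """Expected cost of a fixed policy applied to every ambiguous noise."""
    if policy == "ignore":
        # You only pay when the threat was real.
        return P_LION * COST_EATEN
    if policy == "flee":
        # You pay a small price every time the threat was NOT real.
        return (1 - P_LION) * COST_FALSE_ALARM
    raise ValueError(f"unknown policy: {policy}")

print("ignore:", expected_cost("ignore"))  # 0.01 * 1000 = 10.0
print("flee:  ", expected_cost("flee"))    # 0.99 * 1    = 0.99
# Even at a 1% threat rate, the jumpy policy is roughly 10x cheaper on
# average, which is why a brain biased toward false alarms can be adaptive.
```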

This is where cognitive biases come from. Your brain evaluates thousands of stimuli from the outside every second, and sometimes it makes mistakes.

However, that mistake doesn’t matter if it doesn’t kill you. What matters is that this way of solving problems and making decisions keeps you alive the rest of the time.

The brain looks at these different pieces of information and tries to make patterns out of them. If a wrong pattern emerges, it is called a cognitive bias.

One thing I have noticed is that, now that the study of cognitive biases is so popular, psychologists keep coming out with new types almost every day. Many of them are very similar to one another.

It’s hard to keep track of all of them. So I decided to sit my ass down and simplify things for myself. In the end, I came up with a small framework to help me make sense of cognitive biases.

I have based this framework on some of the things discussed above. Survival and reproduction are the two main goals that drive human existence.

Heuristics or mental shortcuts developed in order to help humans survive long enough to reproduce. These mental shortcuts need to be fast and efficient.

The inputs they work with are different types of information. This information comes from outside stimuli gathered by your senses or from storage (memories) in your brain.

The problem is that you need to assemble all this information into some sort of meaningful pattern. Only once you have this meaning can you make decisions.

I have broken down all these different cognitive biases into two basic categories arising from two fundamental ideas:

1) The world is centered around me.

2) I need to make the correct decision based on the information available.

The first fundamental idea is that you think that the world is centered around you. No matter how altruistic you are, there is still at least a bit of solipsism inside of you.

This idea shapes your inner thought patterns and your relationships with the people around you.

The second fundamental idea comes from the fact that you need information as inputs in order to make a decision.

Sometimes there is a lot of information around you and you need to determine which of it is significant. Sometimes, there is not enough of it and you need to determine what to do in the face of uncertainty.

Oftentimes, the relevant information may not be present at that moment in your environment, and you might need to pull it out of your memory.

Almost all of the main cognitive biases work within these two categories.

However, you also need to keep in mind that you cannot put all the biases into neat little boxes (otherwise you would be committing a cognitive bias ☺ ). Some biases belong in several categories and combine elements of each.

Here is the framework broken down and explained:

1) The world is centered around me.

Think about it: if you are like most people, your world-view is organized around several areas of concern.

First in the order of importance is yourself, then your close family, then your extended family, then your group, and so on. A good way to explain why this happens is the selfish-gene theory, in which a gene pushes for its own propagation.

This theory explains not only why you yourself want to reproduce, but also why you can often act altruistically towards your family members. After all, they share many of your genes.

There are two basic divisions within this category: how you behave inside your group (in-group) and how you behave as a member of your group towards other groups (out-group).

In the in-group situations, you behave in a status-seeking way, since higher status also usually means better access to resources, which means better chances for survival and reproduction.

In the out-group situations, you try to promote the survival of your group versus all the other groups.

Why does this have an impact on cognitive biases?

Cognitive biases are the result of a process that tries to help you survive in this world. One of the most important functions, even among more primitive animals such as lizards, is status seeking.

With higher status, you get access to more resources and thereby you are more likely to live longer. The brain tries to promote this status drive and one way it does this is by faking confidence.

Almost everyone believes that they are above average. For example, studies of drivers have shown that most of the drivers questioned think that they are above average. If everyone is above average, then who is average?

Cameron Anderson, one of the co-authors of a paper on status and overconfidence, explains:

“Our studies found that overconfidence helped people attain social status. People who believed they were better than others, even when they weren’t, were given a higher place in the social ladder. And the motive to attain higher social status thus spurred overconfidence.”

Many of the cognitive biases within this in-group category try to fake confidence. However, there are also some that do the opposite. Why do these exist?

Seeking higher status is a high-risk, high-reward gambit. If you succeed, you get rewarded, but you can also easily fail, and in many cases you can get killed.

If you look at the behavior of chimps in the wild, some individuals who behave in a status-seeking way, but don’t have the tools to back it up, get eliminated by rivals who gang up on them.

So in some cases, trying to preserve the status quo and not engaging in rash actions is the better strategy to ensure your survival.

For example, some cognitive biases promote the lowering of your confidence. In those cases, the individual believes that they are not worthy (as with impostor syndrome) and so tries to keep quiet and avoid doing anything that could get them eliminated.

They will not get access to all the resources of the higher-status individuals, but in many cases they will ensure their survival, and maybe even get to mate with one or two females who pass their time on the edges of the group.

The type of strategy you adopt will depend on your personality (some people are naturally more confident than others), but also on the situation. So you might show overconfidence in one situation, but a lack of confidence in another.

This, in my opinion, is also the basis of prospect theory, where people make decisions based on potential losses and gains.

When engaging in risk-seeking behavior, the potential reward is higher, but so is the uncertainty. If you decide to challenge the alpha male of the group, you might beat him and gain access to all the females and the food. However, you don’t know whether you will beat him. If you lose, you will most likely be killed or banished.

While still status-seeking, most people end up behaving in more risk-averse ways most of the time. Think about it: unless you are down in the gutter, you already have a certain amount of status, which affords you access to at least limited resources.

Most of the time, you are not going to want to risk this and so won’t challenge the status quo. If you take a risk and try to go up in status, you might lose even the things that you have now, and so be worse off than before.

If you engage in risk-averse strategies, you won’t improve your lot in life, but the certainty is higher. You will never have access to a harem of females and the best food, but you might snag the odd female hanging out on the outskirts of the group from time to time, and you will have access to at least some food. So you have a higher likelihood of surviving longer.
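One way to see why the sure-but-modest option usually wins is to run the two strategies through a loss-averse value function of the kind prospect theory uses. The sketch below uses the standard Tversky–Kahneman parameter values; the “status point” outcomes for challenging the alpha versus keeping the status quo are my own made-up illustrative assumptions.

```python
# A rough prospect-theory-style comparison of two strategies.
# Value function follows the shape from Tversky & Kahneman (1992):
# gains show diminishing sensitivity (ALPHA < 1) and losses hurt
# more than equivalent gains feel good (LAMBDA > 1).

ALPHA = 0.88    # diminishing sensitivity
LAMBDA = 2.25   # loss-aversion coefficient

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0) in 'status points'."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Illustrative, made-up outcomes:
# Challenge the alpha: 50% chance to gain 100 points, 50% chance to lose 60.
challenge = 0.5 * value(100) + 0.5 * value(-60)

# Keep the status quo: a sure, small gain of 10 points.
status_quo = value(10)

print("challenge the alpha:", round(challenge, 2))   # about -12.5
print("keep the status quo:", round(status_quo, 2))  # about  +7.6
# The raw expected value of the challenge (0.5*100 - 0.5*60 = +20) actually
# beats the sure +10, but because the possible loss looms so large, the
# modest status quo still feels like the better choice.
```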

Some of the biases work towards pushing you to seek more status, while others work towards you behaving in a more realistic way, not acting too brashly when the risk is high and preserving the status quo.

The choice between risk-seeking and risk-averse strategies depends on your personality, but also on the situation. Sometimes you might employ status-seeking strategies (risk-seeking) and sometimes decide it is better to be realistic and minimize risks (risk-averse).

Status seeking is much more prevalent among males than females, which is probably due to the different mate-seeking strategies that males and females employ.

Females have to carry a baby in their womb for nine months and then care for it, while males don’t. This has direct implications for their behavior.

Studies have shown that the prevalence of certain cognitive biases differs among males and females (for example, a higher percentage of females suffer from impostor syndrome than males), but all the different biases are present among both.

This is probably because, for both males and females, survival is the highest priority, and ego-stroking (and ego-inhibition) is a key part of that.

Now let’s make some sub-categories of biases that go into this category:

In-Group: You have an ego.

First off, let’s start with your in-group behavior.

For most of their history, humans tended to live their entire lives in small groups of at most 150 to 200 individuals. This is where Dunbar’s number comes from.

They shared a certain kinship within this group, but they also competed within it for better access to resources.

The best way to consistently get more and better resources was to be a high-ranking member of the group. So the drive for status within the group was strong.

There are different strategies for achieving higher status, but each of them involves a certain amount of risk.

Your ego is the mechanism that evolved to push you towards higher status. You feel good when you have status, and your ego chides you internally to prod you to strive for more.

The default setting is a preference for rising in status. However, this is risky, and sometimes your ego might pull you into contests that you can’t win.

To prevent that, there is also a mechanism in place to lower your ego.

The basic dichotomy at play is: boost the ego vs. keep the ego in check.

This plays out in the survival strategies you can adopt: a preference for rising in status vs. a preference for the status quo.

Sometimes these two mechanisms act separately, and sometimes they mix. For example, when you feel that your opinion is under attack, your status quo is threatened. If you buckle down and hold stubbornly to your opinion, you not only keep the status quo, but may even feel a boost in status.

A large number of the cognitive biases that humans fall for are part of these mechanisms.


Biases that boost your ego and drive up your preference for rising up in status:

Name of bias: self-serving bias
Description: Self-serving bias is when people tend to attribute positive events to their own character, but attribute negative events to external factors outside their control.
Reasoning (why put in this category): This is all about boosting your ego and self-esteem. When you attribute positive events to your own doing, you highlight your own positive qualities. However, by blaming your failures on external factors, you preserve your ego by saying that there is nothing you could do about it. This is related to other similar biases, such as the fundamental attribution error, or group biases such as group attribution error.

Name of bias: confirmation bias
Description: People have a tendency to search for information that fits their preconceived notions and discount information that doesn’t.
Reasoning (why put in this category): If you believe you are right, you feel good about yourself and want to prove it. It reinforces your sense of superiority and thereby gives you at least an illusory status boost. Sometimes contradictory evidence paradoxically even strengthens your beliefs, resulting in the backfire effect. After all, you have invested so much into believing those things are true that if they were proven untrue, you would lose a lot of status.

Name of bias: Dunning-Kruger effect
Description: This is when unskilled individuals overestimate their abilities and believe they are the shit.
Reasoning (why put in this category): This gives people a fake boost of self-confidence and helps them rise in status.

Name of bias: IKEA effect
Description: This is when people place extra value on things that they constructed themselves.
Reasoning (why put in this category): The fact that you constructed something yourself serves as evidence that you have skills, which gives you a confidence boost. As I wrote in the articles on what you can learn from chimps, one of the three main strategies that chimps vying for status use is intelligence (being ingenious), and the ability to manufacture tools usually boosts their status (alphas tend to excel at this more than others).

Name of bias: illusion of control
Description: People like to discount the role that luck plays in their lives and instead overestimate the amount of control they have over things that happen.
Reasoning (why put in this category): If you believe that you have control over things, it boosts your drive to perform. As studies of the growth mindset have shown, people who believe that working harder will make them better and smarter usually do better than people with a fixed mindset. Some researchers have also argued that the illusion of control is linked to the lack-of-information problem: with limited information on how people will act, the illusion of agency can be a tool to help you predict their intentions and behavior.

Name of bias: overconfidence effect
Description: This is when a person believes that their judgment is more accurate than it actually is.
Reasoning (why put in this category): This, of course, directly inflates a person’s confidence in their own judgment, which feeds the drive for higher status.

Name of bias: reactance
Description: This is a sort of rebellion against doing what you feel someone else is forcing you to do. They tell you to do this and you do something else.
Reasoning (why put in this category): In this type of situation, you feel as if someone is trying to limit your freedom. You feel as if they are trying to control you, which lowers your status. You react by doing the opposite, in order to prove to yourself that you still have the freedom to act.

Name of bias: Semmelweis reflex
Description: This is the tendency to reject new evidence that contradicts your established beliefs or paradigms.
Reasoning (why put in this category): This bias is named after Ignaz Semmelweis, who discovered that when doctors washed their hands, deaths from childbed fever in his maternity clinic dropped dramatically. He was proven right, but the doctors working in his hospital rebelled against him, and he ended up in an asylum. By stating that the doctors were doing something wrong, Semmelweis directly threatened their livelihoods and status. Even though he was right, he ended up getting persecuted. When a new paradigm starts emerging, it directly threatens all the people who spent their lives working under the old one. That’s why their reactions can often be quite harsh.

Name of bias: sexual overperception bias
Description: The tendency, especially among males, to overestimate their desirability to the opposite sex.
Reasoning (why put in this category): One of the main goals of life is to reproduce, and this bias plays into that. If a guy feels that the opposite sex desires him, he is more likely to approach and thereby increase his chances of reproducing and passing on his genes.

Name of bias: social comparison bias
Description: This is when you dislike someone who you subconsciously feel is physically or mentally better than you.
Reasoning (why put in this category): This is quite natural, since if someone is in fact better than you, they threaten your social status. This happens, for example, in the working world, when a manager hires someone who is not the most qualified but does not threaten his position, instead of the most qualified candidate, who could threaten it.

Biases that encourage you to keep the status quo or lower your ego:

Name of bias: status quo bias
Description: This is when you prefer things to stay the same.
Reasoning (why put in this category): This can happen when you feel relatively happy with your position in life, or feel that there are too many risks associated with change, and so prefer things to stay the same. You might not rise in status, but you won’t go down either.

Name of bias: endowment effect
Description: This is when people value something more because they own it.
Reasoning (why put in this category): When you own something, you are sure that you have it, and losing it would be a concrete loss. You also often form attachments to things, and they become part of your identity. Losing them would strike at your self-worth and lower your ego.

Name of bias: loss aversion
Description: This is when a person prefers not losing a certain amount over gaining the same amount.
Reasoning (why put in this category): If you have something, you are certain of getting at least a limited benefit from it. If you lose it, you lose that benefit, and avoiding that loss matters more to you than gaining an equivalent extra benefit. A good proverb to illustrate: “a bird in the hand is worth two in the bush”.

Name of bias: pessimism bias
Description: This is when people exaggerate the chance of bad events happening to them.
Reasoning (why put in this category): This helps put a damper on risk-seeking behavior.

Name of bias: impostor syndrome
Description: This is the belief many qualified people have that they don’t belong, that they got where they are by chance, and that they will soon be exposed.
Reasoning (why put in this category): This probably evolved to dampen a person’s drive and ambition. If they really were in a position for which they are not qualified, the best strategy would be to lie low and not rock the boat. This helps preserve the status quo and avoids danger and risk.

Out-Group: You are a social animal.

Now that we have these biases out of the way, let’s get to another category of biases, the out-group ones.

Humans are social animals and as such need to belong. Ostracism from a group in prehistoric times would often mean death and that’s why humans banded together in groups and individuals strove to belong to them.

This need led to the creation of strong group bonds, but it also ran up against the fact that your group was not the only one in the area.

There were often other groups nearby, and many of them were not friendly. You needed to battle them for territory and resources.

These types of situations promoted the rise of different cognitive biases. Notice that many of the biases in these categories are also linked to the limited-information category (including bandwagoning, the halo effect, stereotyping and many others).

Biases that boost your connections to your group:

Name of bias: ingroup bias
Description: This is where you prefer people from your own group over people from other groups.
Reasoning (why put in this category): This is a very classic in-group vs. out-group dichotomy.

Name of bias: bandwagon effect
Description: This is when people start believing things just because other people believe them. All kinds of different fads start getting popular with people, which influences more and more people to like them.
Reasoning (why put in this category): Different biases like bandwagoning, herd behavior, groupthink and others arise from the need of humans to conform to the group.

Biases that boost your connection to leaders:

Name of bias: authority bias
Description: This is when you believe what an authority (a person of power or influence) believes and ascribe greater weight to their opinion.
Reasoning (why put in this category): You can see this happening all the time, including in recent elections. People are always looking for a Messiah or a savior. This guy is an “authority”, so he must know what he is talking about!

Name of bias: halo effect
Description: This is when you form a positive opinion of a person based on one thing, and this opinion then spreads to all aspects of that person. They can do no wrong.
Reasoning (why put in this category): This type of effect helps spread positive feelings about a person and sort of shuts down the logical part of the brain.

Biases towards other groups:

Name of bias: stereotyping
Description: This is when you paint all the members of another group with the same brush and think that they all have the same characteristics.
Reasoning (why put in this category): This is a classic out-group bias. It is also related to the limited-information problem, and it is a risk-averse strategy: you are trying to preserve your own group, and by being wary of other groups, you keep your guard up.

Name of bias: not invented here effect
Description: This is the tendency to reject things or products that were not created by your group.
Reasoning (why put in this category): Groups take pride in the things they invent and tend to shun things created by other groups. The effect is similar to the IKEA effect, but applies to groups instead of individuals. If you have worked on software projects, you are very familiar with this: instead of reusing existing tools created by other teams, software teams often prefer to reinvent the wheel and create their own version.

Name of bias: group attribution error
Description: This is where you believe that the characteristics of an individual reflect their entire group. It is very similar to stereotyping, but where stereotyping works top down (you have a general stereotype of a group and project it onto any individual from that group), this one works bottom up: you encounter an individual from a particular group and assume everyone in that group shares their characteristics.
Reasoning (why put in this category): This is a way to judge what other groups are potentially like and is also related to the limited information problem.

Name of bias: ultimate attribution error
Description: This is the group-level version of the attribution error: you explain your own behavior (and that of your group) by circumstances, but the behavior of people from other groups by their character. For example, if you observe someone from another group being rude, you think that they are a rude person in general, and don’t take into consideration that their behavior might be the result of some very particular circumstance (like someone calling them names for no reason).
Reasoning (why put in this category): When you see people you don’t know, you need to assess whether they are a threat or not. Since you have limited information, you tend to judge based on that particular interaction and generalize. So this bias straddles both the out-group and limited-information categories. It is also related to the group attribution error, as you might then generalize and pass your judgment onto that person’s entire group.
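To keep the first half of the framework in one place, here it is captured as a simple nested data structure. This is just a sketch of my own grouping from above (and, as noted, many biases straddle more than one box).

```python
# The first top-level category of the framework, with its sub-groups
# and the biases discussed above. The grouping is this post's own
# taxonomy, not an established classification.

WORLD_CENTERED_AROUND_ME = {
    "in-group (you have an ego)": {
        "boost the ego / push for higher status": [
            "self-serving bias", "confirmation bias", "Dunning-Kruger effect",
            "IKEA effect", "illusion of control", "overconfidence effect",
            "reactance", "Semmelweis reflex", "sexual overperception bias",
            "social comparison bias",
        ],
        "keep the ego in check / preserve the status quo": [
            "status quo bias", "endowment effect", "loss aversion",
            "pessimism bias", "impostor syndrome",
        ],
    },
    "out-group (you are a social animal)": {
        "boost your connection to your group": ["ingroup bias", "bandwagon effect"],
        "boost your connection to leaders": ["authority bias", "halo effect"],
        "judge other groups": [
            "stereotyping", "not invented here effect",
            "group attribution error", "ultimate attribution error",
        ],
    },
}
```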

As usual, this post is getting too long, so I will tackle the other big category, “I need to make the correct decision based on the information available”, in Part 2. Just to add, these categories are something that I came up with myself. Most of the cognitive biases seem to fit into them, but there are probably some that don’t. Of course, some biases combine elements from different categories. Trying to fit everything into neat little boxes is probably a bias in itself! ☺

Most of the biases above come from the big Wikipedia list of cognitive biases, and a few come from other sources.

I came up with the categories by trying to go back to first principles. I sat down, made lists of different biases, and tried to figure out their commonalities and differences. Then I kept asking “why” and “how” until I arrived at the lowest common denominators.

I would like to know your opinion on these divisions. So please share, comment, link and like! ☺

Click here for PART 2 of my cognitive biases framework!

—–
Read More:
How to be a critical thinker
