It’s another great day on the savannah. You wake up as the sun begins to rise above the horizon. A cool breeze starts blowing in from the east.

You get all your tools ready as you start preparing for another day of tracking and hunting on the vast grasslands of the East African plains.

This probably would have been a typical scene for you, had you been born about 200 thousand years ago.

You would have had to struggle every day to survive: finding food, fending off enemies, and generally staying out of harm’s way.

Life wasn’t easy in those days. You were not guaranteed to make it to the next day, so you had to hustle hard and always be on alert.

Luckily, you had inherited a set of internal tools embedded deep in your brain that are the result of millions of years of evolution.

These tools were your best bet for surviving and prospering. They drove you to achieve more, to struggle on, but also to take in all the different information coming in from your senses and then make a decision.

These decisions needed to be made quickly and efficiently. They could mean life or death, and besides, you didn’t have much energy to spare.

The internal tools I am talking about are called heuristics: shortcuts for coming up with a solution to a problem and then acting on it.

Psychologists and researchers have started to study them more intensely in the past few decades and have drawn up quite a list.

I have already discussed how I sat down and decided to make sense for myself of all these long lists of heuristics and the cognitive biases that arise from them.

Yes, these heuristics come up with good solutions most of the time, but they can also fail spectacularly some of the time. These failures result in cognitive biases.

In Part 1 of the Cognitive Biases Framework, I discussed the first category of biases that I found, ones I grouped under the “The world is centered around me” problem.

I further divided up this category into two main sub-categories: “I have an ego” and “I am a social animal”.

Yes, you do believe that the world is centered around you, and you do have an ego. And yes, you are a social animal. 🙂

The cognitive biases in these categories are meant to manage your drive for survival and reproduction based on potential risks.

2) I need to make the correct decision based on the information available.

In order to survive and reproduce, you need to be able to take in all kinds of different information, interpret it, and then decide on a future course of action based on it.

Let’s get back to a prehistoric nature scene.

What do you see? What should you be seeing? Where is the danger? Where is the potential food?

Information overload! Too much information!

Wait, do you hear that sound? What could it be? Should you be scared? Is it nothing? Too little information!

You try to search in your memory. Yeah, you heard that sound before. Probably nothing.

What happened? You were bombarded with inputs from your senses, and subconsciously a decision needed to be made about which of these inputs were important and which could be ignored.

You focused on some sounds, but unfortunately you did not have enough information to draw a reliable conclusion. So your brain raced to find analogies to that sound in its memory.

Then based on all these things, your brain came to a decision: No danger imminent, continue on.

What made this decision possible? Information!

While the category described in Part 1 was about your relationship with other people, the category described here is about your relationship with information.

The brain uses different types of inputs as information in order to create patterns, and then get meaning out of these patterns. These processes are the source of many cognitive biases.

Why do many of these patterns turn out to be wrong? The answer once again has to do with risk.

Going back to our ancient savannah, imagine walking around in the tall, yellowish-colored grass. You hear a sound, and see the grass in front of you ruffle a bit.

You are missing information. There are two possible courses of action.

Either your brain does not form a pattern, does not associate the sound and the ruffle with anything, and you keep walking straight ahead. Or the brain forms a pattern, surmises that it could be a dangerous animal, and you decide to take out your spear and backtrack slowly and carefully.

Choose your own adventure!

What if you did not form a link between the ruffle of the grass and anything, kept on walking, and walked straight into the mouth of a hungry lion? Big mistake!

On the other hand, if you formed a pattern and backtracked out of potential danger, you walked out of harm’s way. Even if there was no lion in the grass and the pattern you created was false, you are still alive and this mistake didn’t cost you much.

The risk of not making a pattern and turning into lunch meat is much bigger than making a pattern that turns out to be false.

This asymmetry of risks is why cognitive biases have a tendency to happen: a false pattern is usually much cheaper than a missed one.

There are some basic principles about how you and your brain work in the context of information. These principles have a big effect on how these cognitive biases take place.

Principles:

Your brain tries to find meaning.

Your brain works by forming associations.

Your brain works by drawing analogies.

What you see is all there is.

The emotions you feel at the moment have a huge impact on how you perceive a situation.

The way your brain stores memories is not perfect.

It has been said that humans are storytelling animals. Telling stories is one of the most ancient and most popular ways of conveying information. We are all suckers for a good story.

The reason why stories are so powerful is because they connect the dots smoothly. Things happen in sequence and there is always a cause and an effect.

This type of structure conveys meaning very well, and that’s the most important thing humans are looking for: meaning.

What all the different cognitive biases dealing with information have as a common denominator is that they give you some sort of explanation (a meaning) for all the different things that are happening.

This is the basic principle from which all the other principles derive. Your brain tries to determine this meaning by forming associations between different elements.

These associations can then be stored in your memory, so that they can be used at a later date as well. The way your brain is set up, synapses form between neurons, linking them together. Neurons that fire together wire together, and once they are wired together, activating one tends to activate the other.
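To make the associations idea a bit more concrete, here is a tiny code sketch (purely my own illustration, not a real neuroscience model) of a Hebbian-style update: the more often two things are active at the same time, the stronger the link between them becomes.

```python
# A toy sketch of association by co-activation ("fire together, wire together").
# Purely illustrative -- real synaptic learning is far more complex.

def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
    """Strengthen a connection whenever both units are active at the same time."""
    if pre_active and post_active:
        weight += learning_rate
    return weight

# Start with no association between "rustling grass" and "lion".
association = 0.0

# Every co-occurrence of the two strengthens the link a little.
for _ in range(10):
    association = hebbian_update(association, pre_active=True, post_active=True)

print(f"Association strength after 10 pairings: {association:.1f}")  # -> 1.0
```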

Another principle behind how the brain works is that it likes to form analogies. For example, if it has a hard question to answer, it might substitute a similar but easier question and answer that instead.

To put all this in perspective, you need to understand that the brain usually works within a limited context and largely devotes itself to the present moment (the now).

For the brain, what you see is all there is, which Daniel Kahneman calls the WYSIATI principle.

The future is unpredictable, and in order to get there you still need to deal with everything that is happening now! 🙂

The WYSIATI principle states that the brain deals with its immediate surroundings and what it can get out of them, with things it knows. It doesn’t concern itself with things it doesn’t know.

If it doesn’t know something, it just makes it up. Basically, your brain fills in the blanks with its best guesses.

To make decisions quickly, your brain relies on emotions. So the emotions you are feeling at the moment have a huge impact on how you react to a situation.

Of course, another important element for any type of decision is the ability to store memories and then recall them as needed. The way this happens in the brain can be quite messy and is another source of cognitive biases.

How to make a decision?

The process of arriving at a conclusion has some basic elements which are similar whether you use a rational approach (what Kahneman calls System 2) or heuristics (the default process for most people, which Kahneman calls System 1).

You start off with a bunch of information. The first thing you need to do is to filter it down and find which information is relevant for the problem at hand.

Many times, the relevant information that you get might not be complete. There could be some information missing and you need to find it. So you go to an outside source to try to do that.

These are the same basic steps you go through for any type of everyday problem that you use heuristics for.

You are walking out on the savannah, taking in all the sights and sounds around you. Then you zoom in on something on the ground.

You see what looks like footprints and also a pile of dung. This is of course incomplete information. Whose footprints are these? Whose dung? How long ago did this happen?

You access your memory and try to find the answers. The footprints look like those of an elephant, the dung too. An elephant must have passed by here.

Then you remember that there is a water hole nearby. You try to fill in the blanks. Maybe the elephant went there to drink some water?

This little story illustrates a basic distinction about information: either there is information present in the environment or there isn’t.

There is information present in the environment.

When there is information present, however, there might be either too much of it or too little.

Both affect decision-making. When you need to make a quick decision, both too much info and too little info can be a hindrance.

Headache 1: There is too much information present in the environment.

When there is information present in the environment, sometimes there might be too much of it.

Going back to our elephant dung story, before finding that dung and the footprints, you were walking along a path. All around you there was grass, different noises, bees buzzing, little critters scurrying away, birds flying overhead, the wind picking up, herds of animals visible in the distance, and so on.

All this stuff could easily overwhelm you. There is a law in psychology called Hick’s Law, which postulates that an increasing number of choices increases the decision time logarithmically. You could face paralysis by analysis.
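To get a feel for Hick’s Law, here is a minimal sketch of its standard form, T = b × log2(n + 1). The constant b is just an illustrative value I picked, not a measured one; the point is how decision time keeps growing as the number of choices doubles.

```python
import math

def decision_time(n_choices, b=0.2):
    """Hick's Law: T = b * log2(n + 1).
    b (seconds per bit) is an illustrative constant, not a measured one."""
    return b * math.log2(n_choices + 1)

for n in (2, 4, 8, 16, 32, 64):
    print(f"{n:2d} choices -> ~{decision_time(n):.2f} s")
```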

With all that information coming in from the senses, you needed to focus on what was important and relevant for you.

Your brain learned to filter out all the information it deems irrelevant and focus on the specifics. This is what is behind many of the classic cognitive biases.

It is generally agreed that humans are bad at statistics and, for example, often ignore base rates when estimating probabilities.

However when you take some of the basic principles described above, it is pretty easy to understand why this is the case.

In most common every-day situations, the brain does not need to calculate complex probabilities. That would take too much time and effort.

Take the classic Linda problem that Kahneman described in his book “Thinking, Fast and Slow” (and which I described here).

Humans didn’t evolve to be accurate; they evolved to judge quickly whether Linda is a threat or not. So it doesn’t matter whether their judgment is accurate, as long as it saves their life. The general base rate doesn’t matter; what matters is the specific.
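A quick way to see why the intuitive answer to the Linda problem is wrong: the probability of two things being true together can never be higher than the probability of either one alone. Here is a minimal sketch with made-up numbers (not Kahneman’s data).

```python
# Conjunction rule: P(A and B) can never exceed P(A), whatever the numbers.
# The probabilities below are purely illustrative.

p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # P(Linda is a feminist, given she is a bank teller)

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_bank_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")
# The conjunction is always the less likely option, even though the
# richer description makes for a better "story".
```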

Humans get their inputs from their senses (linked to WYSIATI), so abstract reasoning was never a very important part of how things got done out in the field. The moment you start calculating probabilities is the moment you stop paying attention, and at that moment you are dead.

Another reason why people ignore base rates and go for something more specific? Probably because they can identify better with it, they can picture it, they can fit it in a story. People prefer to think by analogy, and specifics give them analogies.

Biases due to too much information present in the environment:

Name of bias: attentional bias
Description: This is the tendency to pay attention to some things, while not paying attention to others.
Reasoning (why put in this category): You need to make a decision, however there are too many inputs and possibilities out there. So you basically shut down your thoughts on most of the possibilities and instead focus on just a few. This can result in tunnel vision and ignoring good possibilities, but can be helpful if you need to make a decision fast.

Name of bias: curse of knowledge
Description: This occurs when you think the people you are talking to have the same background information and knowledge of the issue as you do.
Reasoning (why put in this category): There is so much information to convey, but you don’t know which information you need to start with so that the other person can understand. If you are an expert, you have such deep knowledge of a subject that it might be hard to put yourself in the shoes of a beginner. You assume that everyone has the same view and knowledge of the subject as you do. This is also a good example of solipsism (the “world centers around me” view).

Name of bias: focusing effect
Description: The tendency to focus on just one aspect of an event.
Reasoning (why put in this category): There are so many things that happen in an event that you cannot grasp them all, so people tend to focus on just certain aspects of that event. That’s why often people can have such different interpretations of the exact same event.

Name of bias: Forer effect
Description: This is when people believe that statements given to them for example in horoscopes (or in a similar way) are true and reflect them, when in fact these statements are just vague and general and could fit almost anyone.
Reasoning (why put in this category): There is a shitload of general info out there and your brain is trying to make connections to it. It succeeds in finding a potential relevance, even if it is imaginary. How does it find relevance? Take the statement “you have been thinking a lot lately about a special person”. These are all vague words, but your brain tries to connect them to particular instances in your own life. “Lately” could be any period, while the “special person” could be anyone really. Maybe last month you thought hard about some girl you like. Bingo! The brain makes the connection between the general statement in the horoscope, that period, and that girl, and yes, the horoscope is true! Amazing!

Name of bias: identifiable victim effect
Description: This is when you have a greater empathy and willingness to help a specific individual rather than a large anonymous group.
Reasoning (why put in this category): Here you ignore the big groups of people who are largely a faceless mass to you and zoom in on the specific individual. Large groups of people are just statistics as Stalin supposedly once said. However when you identify a specific individual, you can relate to them more. This probably is also linked to kinship to your group. Throughout much of history, humans lived in small communities (see Dunbar’s number) where you personally knew the other individuals. However with outsiders you were more distant, grouping them as a mass and forming general stereotypes about them.

Name of bias: pareidolia
Description: A random pattern is perceived as significant.
Reasoning (why put in this category): There is so much information out there, but the brain needs to pick out the relevant information and make sense of it. That’s why it can often overthink things and pick out patterns that don’t really exist.

Name of bias: selective perception
Description: This is when people perceive what they expect or want to perceive and ignore other messages.
Reasoning (why put in this category): Too much information out there, so this is a mechanism to filter it out. Similar to confirmation bias.

Many (but not all) of the cognitive biases to do with statistics and base rates fall under this category, including: base rate fallacy, conjunction fallacy, insensitivity to sample size, money illusion, neglect of probability. With these statistics-based biases all the information is there, you are just focusing on the wrong thing. For example with the base rate fallacy you are focusing on a specific piece of information instead of the actual base rate.
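To see what ignoring the base rate actually costs, here is a small worked example in the style of the classic “rare condition, imperfect test” problem. All numbers are made up for illustration; the point is how strongly the low base rate drags down the final answer.

```python
# Base rate neglect, illustrated with Bayes' theorem.
# All numbers are made up for illustration, not taken from any real study.

base_rate = 0.01             # 1% of people actually have the condition
sensitivity = 0.90           # P(test positive | has condition)
false_positive_rate = 0.10   # P(test positive | does not have condition)

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# ~8%, even though the test "feels" 90% reliable: the low base rate
# dominates, and that is exactly what the base rate fallacy ignores.
```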

Headache 2: There is too little information present in the environment.

Most of the time, you are forced to make a decision based on incomplete information. When you don’t have all the required information to be 100% sure, your brain gets to work.

It starts making connections between the information that you do have and tries to fill in the gaps.

An interesting analogy for how this happens can be found in what happens to some people who suffer from an illness that robs them of their sight.

Some people who have total or severe blindness experience visual hallucinations. This is called the Charles Bonnet Syndrome. Reportedly, these people start “seeing” pictures of things like faces, animals or objects, which often appear smaller than in real life and sometimes have cartoonish features.

What happens is as their vision deteriorates, their mind is forced to start filling in bigger and bigger gaps. This then results in these hallucinations. They are not real, but feel real.

The key role of your brain is to find meaning, and when information is missing, it gets to work creating patterns, drawing analogies, and filling in the missing information with something plausible.

Frequently, what the brain fills in is not real. Just like patients suffering from blindness who get cartoonish hallucinations, ordinary people can go through all kinds of “hallucinations” of their own, these being the results of cognitive biases.

Biases due to too little information being present in the environment:

Name of bias: ambiguity effect
Description: This is an effect which has at its core the lack of information. People choose an option for which the probability of a favorable outcome is known over the option where it is unknown.
Reasoning (why put in this category): Classic lack of information.

Name of bias: anchoring effect
Description: Relying on an anchor, even if it has nothing to do with the problem.
Reasoning (why put in this category): Since there is a lack of information, people grasp at straws, and even something which has nothing to do with the problem can sway their answers. Can be good to use in negotiations.

Name of bias: availability heuristic
Description: This is when you treat the examples that come to mind most easily as being more likely, ignoring their actual probabilities.
Reasoning (why put in this category): This is a very common effect. For example people hear of plane crashes in the media all the time, and oftentimes come to conclusions that they are very frequent, when in fact they are not.

Name of bias: bandwagon effect
Description: This is when you believe and behave in the same way as all the other people around you.
Reasoning (why put in this category): Animals living in herds often follow what the other animals in the herd are doing, since there is safety in numbers. If all the other animals are going left, then there is probably a good reason why they are doing that and so it is wise to go left with them. After all, maybe there are some predators ahead. Also, as written before, the bandwagon effect is linked to trying to fit in with your group.

Name of bias: framing effect
Description: This is when you come to different conclusions based on the way the information is presented.
Reasoning (why put in this category): People tend to be risk-averse and usually shy away from danger and so if a choice is presented where the danger is highlighted, they will behave differently if it is shown in another way, even if the information is the same.

Name of bias: functional fixedness
Description: Doing things or using objects in the same way every time.
Reasoning (why put in this category): Since you don’t really have much time to think about doing things in novel ways every time, your brain has some algorithms implanted that tell you what to do straight away even if it is not the most efficient way of doing things. There could be a better way of doing things, but you don’t know about it and in fact sometimes your brain even shuts off the possibility for it. This also has to do with the way your memory works.

Name of bias: fundamental attribution error
Description: This is where you overemphasize personality factors of others over circumstances.
Reasoning (why put in this category): Here you have a lack of information on others and so need to draw conclusions.

Name of bias: gambler’s fallacy
Description: This is where someone believes that if something happens more frequently than normal during a specific period, it will happen less frequently in the future. Happens frequently with gamblers, hence the name.
Reasoning (why put in this category): This is a statistics-based problem and a way for people to deal with the fact that they lack information about the future (see the short simulation sketch after this list).

Name of bias: hyperbolic discounting
Description: This is where you have a preference for immediate payoffs over payoffs to happen in the future (even if they are bigger).
Reasoning (why put in this category): You lack information on what will happen in the future.

Name of bias: illusory correlation
Description: Here you draw up an imaginary relationship between two events where none actually exists.
Reasoning (why put in this category): You need to come up with some sort of a meaning, and since you lack information, you fill in the blanks.

Name of bias: information bias
Description: This is when you seek more and more information, even when it has no impact on the decision.
Reasoning (why put in this category): A lot of times people feel that they don’t have enough information and keep on seeking more.

Name of bias: normalcy bias
Description: Here is where you don’t plan for something that has not happened to you before.
Reasoning (why put in this category): You lack information.

Name of bias: planning fallacy
Description: This is when you underestimate task completion times.
Reasoning (why put in this category): You lack information and cannot really predict the future. Related to many other planning biases.

Name of bias: projection bias
Description: This is when you overestimate the chance that you will have the same preferences in the future as you do now.
Reasoning (why put in this category): You lack information on the future.

Name of bias: pseudocertainty effect
Description: The tendency to be risk-seeking when trying to avoid negative outcomes, but risk-averse when securing positive ones.
Reasoning (why put in this category): This is one of the many biases that work using the framing effect. If something gets framed in a negative way, you are more likely to perform actions with a higher risk than if it were framed in a positive way.

Name of bias: negativity bias
Description: Many people have a tendency to put a greater emphasis on negative things than positive. When all is equal, negative things have a greater effect on a person’s state of mind and decision-making.
Reasoning (why put in this category): If a negative thing happens to a person, it usually has a greater impact on their well-being than when something positive happens. That’s why most people are hardwired to pick out negative signals over positive ones when too little information is present in the environment. This effect is also linked to things like the status quo bias or loss aversion.

Name of bias: rhyme as reason
Description: This is when you think something that rhymes is more truthful than a normal sentence.
Reasoning (why put in this category): This is one of the many biases where you are grasping at straws due to lack of information.

Name of bias: survivorship bias
Description: Concentrating on people or things that survived a process over those that didn’t.
Reasoning (why put in this category): You don’t have information on all the people who underwent the same process.

Name of bias: triviality
Description: Focusing on unimportant matters.
Reasoning (why put in this category): Some things might be too hard to deal with, so it feels more rewarding to focus on things that are easy.
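As mentioned in the gambler’s fallacy entry above, here is a small simulation sketch (a fair coin and a run length I made up) showing that the chance of heads stays at about 50% even right after a streak of heads:

```python
import random

# Gambler's fallacy check: after a streak of heads, is tails "due"?
# Simulate a fair coin and look only at flips that follow 5 heads in a row.
random.seed(42)  # fixed seed so the run is reproducible

flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_streak = [flips[i] for i in range(5, len(flips))
                if all(flips[i - 5:i])]  # the previous 5 flips were all heads

heads_share = sum(after_streak) / len(after_streak)
print(f"Flips that follow 5 heads in a row: {len(after_streak)}")
print(f"Share of heads among them: {heads_share:.3f}")  # stays around 0.5
```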

There is no information present in the environment.

What if there is no information present in your surrounding environment? Well, then you will just have to go back to your memory.

Headache 3: You need to get things out of your memory.

Your memory is where all the information that you come across gets stored for later use. The way it works is complex, and several parts of the brain are involved in the process. Researchers are still only beginning to comprehend how it works and there are still many things that we don’t know about it.

The workings of your memory deserve a long article of their own, but here is a short summary anyway. What is important is that the entire process is imprecise and results in many cognitive biases.

Information first gets held in your short-term memory, and some of it then gets transferred to long-term memory for storage. Once a memory is stored in your long-term memory, you sort of forget about it, unless something wakes it up for you.

Your memory works on associations, due to the fact that synapses form between different neurons. The classic experiments of I.P. Pavlov on dogs show how this happens: the dogs came to associate a bell with food, and later, whenever they heard the bell, their memory recalled the food and they started salivating.

As mentioned before, the way your memory works is very far from perfect. Many glitches happen along the way. It can store vast amounts of information, but at any moment, you remember only a fraction of it.

Wouldn’t it be good if you could remember everything that happened in your life? Actually, it wouldn’t. There are some rare individuals in this world who have a condition called hyperthymesia, which makes them remember almost everything.

One case is that of a man named Solomon Shereshevskii, who was born in Russia at the end of the 19th century and could remember literally everything. His case was studied by Soviet neuropsychologist Alexander Luria and revealed many interesting aspects of how the brain works.

The fact that you can remember everything means you remember not only the good, but also the bad. And this can be painful. Shereshevskii spent much of his life trying to learn how to forget.

The fact that your memory works on associations is probably what allows this kind of limitless memory. Shereshevskii experienced strong synesthesia, where the different senses merge together: if you stimulate one sense, other senses get stimulated too, so for example when tasting something you might also smell it.

If you read the book “Moonwalking with Einstein” by Joshua Foer, in which he details his rise to becoming the US Memory Champion, you will notice that many of the techniques Foer describes for remembering better actually take advantage of the formation of associations and of certain aspects of synesthesia. So a good way to remember things is to form connections between them (one technique for doing that is the memory palace).

Since being able to remember everything would not be the blessing it appears to be at first glance, we are lucky to have the ability to forget.

However, you still need to be able to access some memories. The mechanism that helps with this is again based on associations: one memory helps you retrieve related ones. It’s a great feature, but also one that leads to many cognitive biases.

Another source of biases is the fact that the way your brain stores memories is imperfect and many mistakes occur during that process.

Not only can your brain store memories in a wrong way, it can also overwrite correct memories and implant false memories in their place. Some people can’t tell the difference and even start believing the fake memories they created! Some liars are so messed up that when they are lying, they actually think they are telling the truth!

The principle of what you see is all there is and the strong preference for the present also play a huge role in cognitive biases associated with memory. The future is unknowable, and the way you remember the past can change based on your present state.

You might have also noticed that many of the cognitive biases, for example those about outside groups, rely on the formation of generalities (which is the basis of stereotypes). These then get stored in memory.

Why remember in generalities? It makes it easier to decide and act.

For your survival, it’s better to have a general rule “all snakes bad! avoid!” than a rule saying “well, snake with the greyish spots is bad, but the one with the orange ring on top of greyish spots is harmless“.

In the second case, you stop to examine whether that particular snake has an orange ring or not, and then the snake strikes, bites you, and turns out to be venomous. In the first case, you don’t take the chance and instead run if you see anything even vaguely resembling a snake.

Many times you store these generalizations in your memory for later use. This allows you to act using your instinct.

Biases due to how your memory functions:

There are many kinds of memory cognitive biases, and many of the cognitive biases I listed in the previous categories also use aspects of memory biases.

One interesting memory bias is the Zeigarnik effect, where uncompleted tasks are remembered better than completed ones. I might write a longer article on these at a later point, but here is a Wikipedia list of memory biases that you can use to refer to in the meantime.

This post was brought to you thanks to my confirmation bias! 🙂

I had been learning about cognitive biases for a long time, but always felt that the lists of them were a bit haphazard and all over the place, with some of the biases being very similar to each other. There was no coherent framework for classifying them that ordinary people could use and apply in practical terms to their own lives.

I decided that a good framework could be cooked up by combining this stuff with evolutionary psychology. For me it makes sense (but maybe I am just making connections where none exist). After drawing up a skeleton framework, I went to search the internet to confirm my bias. 🙂 🙂 🙂

One of the few classifications that I came across was done by Buster Benson and is available here. There are some similarities with my classification, but also differences.

For example I put the search for meaning as the end goal of all cognitive biases dealing with information, while he has it as one of the categories. Also for me, doing things quickly is an inherent attribute of all biases.

Most importantly, apart from the information category, for me the ego and social animal categories are incredibly important as a way of understanding certain cognitive biases.

Another categorization I found is this one called the SEEDS model from the NeuroLeadership Institute. And one final categorization is this one done by a poster over on the “Less Wrong” blog.

There are also some academic articles that confirm my ideas and could shed a further light on them: here, here, here, and here.

The danger of falling for cognitive biases is now greater than ever. The conditions humans lived in during prehistory have changed significantly and continue to change at an enormous pace.

While in the past, the average human would stay part of one community for a long time, in this day and age, we are all parts of multiple communities at the same time, some of which don’t overlap at all.

More importantly, city life paradoxically also leads to more social isolation. You are surrounded by thousands of people, yet feel alone.

How to overcome these biases? That’s not a simple thing to do, but there are some tools that can lessen them, for example paying more attention to your basic assumptions, or maybe using Bayesian thinking. I plan to do more articles on debiasing in the future.

Of course one final thing to remember is that all models are just approximations of reality, but not reality itself. The map is not the territory.

“All models are wrong but some are useful.” – George Box

There are still a lot of things that we don’t know about the human brain and how it works. The frameworks I came up with can be useful when trying to think rationally in your life, but can and will be improved as time goes by.

Dr. Brainiac: “Mr. Chimp, you are such a perceptive individual. You know what you want and do everything to achieve it, however some things have still eluded you. Right?”

Smart Chimp: “You just get me, man!”

Please share, comment, link and like! 🙂


Read More:
How to be a critical thinker

Part 1 of the Cognitive Biases Framework

