This is Part 2 of a 3-part series on Critical Thinking. You can find Part 1 here.
6) Anchoring Effects
Anchoring is the process of using one initial piece of information to make an inference about the final answer. It happens quite often and can be a major source of errors.
For example, studies have shown that even stating a number that is way off base can affect the other person’s final answer, even when they know the number you gave is way off.
Let’s use an example to illustrate this. You might ask the initial question:
“Is Mt. Kilimanjaro over or under 20,000 meters high?”
Most people will know that the number given is way too high. However, it subconsciously affects their thinking. When they are then asked to estimate how high the mountain actually is, the people who were anchored with this initial number give, on average, a higher guess than people who are asked to estimate straight away.
Let’s go back to the last two questions of the test I gave you previously. They were meant to demonstrate how the anchoring effect works subconsciously on your brain. The first question was set up in such a way as to induce anchoring in answering the second.
Most people would guess correctly that the heaviest person weighed below 1,000 kilograms, but they would not be sure of his exact weight. The number 1,000 in question 4) served as the anchor for the mind. Experiments have been performed on people answering similar questions with and without the use of anchors.
The results differed between the two groups. The people who had previously seen the anchoring number tended to reply with a number closer to the anchor than the people without one. Even though they knew the number given in question 4) was wrong, the mind still fixed on it and used it as a starting point for calculating the answer to question 5).
This type of anchoring can have powerful effects in situations such as price negotiations. For example, if a seller opens with a high asking price, that figure often serves as an anchor, and the buyer will subconsciously use it when coming up with counter-bids.
Another application of this effect that has come up in studies also relates to prices in negotiations. Different studies have shown that precise numbers, such as $19.95, have more of an effect than round prices, such as $20.
To cite one of these studies:
“The role of conversational processes in quantitative judgment is addressed. In three studies, precise numbers (e.g., $29.75) had a stronger influence on subsequent estimates than round numbers (e.g., $30), but only when they were presented by a human communicator whose contributions could be assumed to observe the Gricean maxims of cooperative conversational conduct. Numeric precision exerted no influence when the numbers were presented as the result of an automated procedure that lacks communicative intent (Study 1) or when the level of precision was pragmatically irrelevant for the estimation task (Study 2).”
7) Base Rate Fallacy
This fallacy deals with the way the brain often ignores base rate information. The brain focuses on specific information but tends to forget the base rate, that is, the more general statistical information.
An illustration of this type of fallacy is the blue cab and green cab problem, where people ignore the information on the overall color distribution of the cabs in the city, which is the base rate data, and focus instead on the specific testimony provided by a witness. They judge that testimony as more relevant than the general base rate information.
People often ignore statistical information in favor of one salient fact. Sometimes this results in some head-scratching conclusions.
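You can see how badly intuition misfires here by working the cab problem through Bayes’ theorem. The figures below (85% green cabs, 15% blue cabs, a witness who is right 80% of the time) are the commonly cited version of the problem; treat them as assumptions for this sketch:

```python
# Commonly cited cab-problem figures (stated here as assumptions):
p_blue = 0.15       # base rate: 15% of the city's cabs are blue
p_green = 0.85      # 85% are green
p_correct = 0.80    # the witness identifies a cab's color correctly 80% of the time

# Probability that the witness says "blue" at all:
# either the cab is blue and they are right, or it is green and they are wrong.
p_says_blue = p_correct * p_blue + (1 - p_correct) * p_green

# Bayes' theorem: P(cab is blue | witness says "blue")
p_blue_given_says_blue = (p_correct * p_blue) / p_says_blue

print(round(p_blue_given_says_blue, 2))  # 0.41
```

Even with a fairly reliable witness, the cab is more likely green than blue, because the low base rate of blue cabs drags the answer down. Most people, ignoring the base rate, guess something close to 80%.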
8) Conjunction Fallacy
The brain has a tendency to think of more specific scenarios as being more likely than more general ones, even though statistically this cannot be the case. People look at the representativeness of a description but fail to take into account the probability aspect of the scenario. This results in the conjunction fallacy.
The conjunction fallacy occurs especially when people rely on their impressions and neglect to step back and consider the whole picture.
To illustrate, Daniel Kahneman’s book “Thinking, Fast and Slow” gives this example:
Picture this scenario:
“Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.“
Which of the two statements below is Linda more likely to be:
1) Linda is a bank-teller.
2) Linda is a bank-teller and is active in the feminist movement.
So what do you think?
If you are like most people, you probably picked the second option. However, look at the statements again, carefully. Statistically, is a person more likely to be a bank-teller, or a bank-teller AND active in the feminist movement?
Notice that the first statement is more general, while the second statement is more specific. The people who picked the second statement as being more likely committed the conjunction fallacy. In any population, there are always going to be more bank-tellers than bank-tellers who are also feminists. So the first statement is actually the more likely scenario.
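A quick simulation makes this concrete. The probabilities below are made up purely for illustration; the point is that the conjunction can never outnumber either of its parts, whatever numbers you plug in:

```python
import random

random.seed(0)

# Hypothetical population with two independent, made-up trait probabilities.
N = 100_000
p_teller = 0.02      # assumed share of bank-tellers
p_feminist = 0.30    # assumed share of feminist-movement activists

tellers = 0
teller_feminists = 0
for _ in range(N):
    is_teller = random.random() < p_teller
    is_feminist = random.random() < p_feminist
    tellers += is_teller
    teller_feminists += is_teller and is_feminist

# "Bank-teller AND feminist" is a subset of "bank-teller",
# so its count can never be larger.
assert teller_feminists <= tellers
print(tellers, teller_feminists)
```

This is just the rule P(A and B) ≤ P(A): adding a condition can only shrink the set of people it describes, never grow it.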
Humans are really bad at statistics. That’s because the mind is better at understanding causal explanations, as in “A happened because of B,” but is not very good at grasping statistical ones. That’s why storytelling is often such a powerful way of convincing and moving people: it paints easy causal pictures in the head, which humans are better at relating to.
Another consequence of this statistical illiteracy is called the gambler’s fallacy. It describes how people often think that future events are affected by past events, even though the two events are independent of each other.
Let’s go back to the questions on the coin tosses that I had you think about earlier. Many of you probably gave different percentages, because you thought that the coin landing heads five times in a row, for example, means that it should land tails soon.
However, that is wrong. The next coin toss (assuming it’s a fair coin and not rigged in any way) is independent of all the previous tosses. So the probability of it landing tails is still 50%.
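You can check this independence yourself with a short simulation: toss a fair coin six times, keep only the runs that start with five heads, and see how often toss number six comes up tails.

```python
import random

random.seed(42)

# Simulate many sequences of six fair coin tosses. Among the sequences
# that begin with five heads in a row, count how often toss six is tails.
trials = 200_000
streaks = 0
tails_after_streak = 0

for _ in range(trials):
    tosses = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(tosses[:5]):                  # five heads in a row
        streaks += 1
        tails_after_streak += not tosses[5]

print(streaks, round(tails_after_streak / streaks, 2))  # ratio comes out near 0.5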
This fallacy is called the gambler’s fallacy because it is often observed in casinos. There is a story from the Monte Carlo casino in 1913, when in a game of roulette the ball fell on black 26 times in a row! The gamblers ended up losing millions of francs. This was all due to their perception getting the best of them.
Seeing the streak, the people sitting around the roulette table incorrectly assumed that the probability of the ball landing on red increases each time it lands on black. So they kept betting on red, and most of them went home deep in losses. This type of thinking happens in casinos all the time.
Now for a totally different type of mind failure, watch this video:
What happened there is called the Forer Effect. People tend to think that descriptions that seem tailored for them, but are in fact phrased in very general and broad language, are highly accurate. Horoscopes and fortune tellers work on this principle.
Take a look at any horoscope and notice how it is worded in very general language. When you read one, your brain looks for the things that fit and discards the rest, giving you the feeling that what is written is highly relevant to you and true.
This is the same way cold reading works. For example, if you want to appear as a clairvoyant, you can give some of these very vague phrases to a person to describe their personality or thoughts, and they will be amazed by how insightful you are. 🙂