Bayesian Thinking: If You Want To Be A Critical Thinker You Need To Understand This Concept

It is the middle of the Cold War. Tensions are high, and the United States wants to be ready to retaliate against any Soviet nuclear strike, or to launch a first strike if needed.

To be able to react fast, General Thomas S. Power initiates an operation called “Chrome Dome”, in which B-52 bombers armed with thermonuclear weapons fly continuously on set routes reaching certain points close to the Soviet border.

As part of this operation, early on the 17th of January 1966, a B-52G bomber of the United States Air Force takes off from Seymour Johnson Air Force Base in North Carolina. It is carrying 4 hydrogen bombs.

At 10:30 local time, over the coast of Spain, it begins a routine refuelling with an air tanker plane.

However, something goes wrong: as the procedure begins, the tanker plane collides with the fuselage of the bomber, snapping off the bomber’s left wing. A huge explosion destroys the air tanker and severely damages the bomber.

All people aboard the air tanker, as well as some aboard the bomber, die instantly. The rest of the bomber’s crew manage to parachute to safety.

The wreckage falls to the ground near a small village on the coast called Palomares. The nuclear bombs land nearby as well.

Three of the bombs are recovered relatively quickly (two are partially damaged however and cause nuclear leaks on the ground), but the fourth is nowhere to be found.

The search team looks at the evidence and concludes that the bomb was probably blown out over the sea by the wind, and so is probably lying somewhere at the bottom of the Mediterranean.

They are facing a dilemma. If damaged, the bomb could cause great harm. If undamaged, it could fall into enemy hands. Whatever the cost, it needs to be found.

What to do?

Put yourself in their shoes. There are some things that you do know.

A tail plate was recovered, which strongly suggests that the bomb’s parachute deployed.

You have an eyewitness account. A local fisherman says he saw the bomb enter the water, and he points out the location where he saw it.

You also have a detailed map of the seabed in that area.

Enter John P. Craven.

The chief scientist in the US Navy’s Special Projects Office, Craven had a particular knack for solving impossible problems.

His key strength was his versatility. He described himself as being “sort of educated in everything.”

He had a Bachelor of Arts degree from Cornell University, a Master of Science in Physics from Caltech, a PhD in mechanics and hydraulics from the University of Iowa, and a law degree from George Washington University.

While at the University of Iowa, he decided to take all kinds of courses in subjects as varied as journalism and philosophy. He was never the top student, and supposedly only got a C in statistics.

However, this varied training gave him many advantages over people who had concentrated on only one subject.

Craven was a true Renaissance Man, with a head full of different mental models.

This was mission impossible as far as anybody was concerned. Finding the nuke in the middle of the open sea was akin to finding a needle in a haystack.

Craven couldn’t sleep and stayed awake for nights thinking about how to solve this seemingly hopeless problem.

Then it hit him. Why not use an old statistical method called Bayes Theorem? He had already seen it used in a few very specific contexts, but it had largely been forgotten by the rest of the world.

It might actually work, he thought to himself.

This theorem had been around for two centuries already. It was initially formulated by Thomas Bayes in the 18th century and further developed by Pierre-Simon Laplace. However, by the mid-20th century most people either didn’t know about it or considered it of little use.

The theorem describes the probability of an event given prior knowledge of conditions related to that event.

This can then be applied to try to figure out the best solution to a problem. Steven Novella in his blog post gives a good summary of the process:

Begin with an estimate of the probability that any claim, belief, hypothesis is true, then look at any new data and update the probability given the new data.
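Novella’s summary can be sketched numerically. Here is a minimal Python sketch of a single Bayesian update; the prior and likelihood numbers are purely illustrative:

```python
# A single Bayesian update, following the summary above.
# All numbers here are illustrative, not from any real problem.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated probability that a claim is true,
    given one new piece of evidence."""
    numerator = p_evidence_if_true * prior
    total = numerator + p_evidence_if_false * (1 - prior)
    return numerator / total

# Start fairly sceptical (prior 0.3). The new data is three times
# as likely if the claim is true (0.9) as if it is false (0.3).
posterior = bayes_update(0.3, 0.9, 0.3)  # belief rises from 0.3 to 0.5625
```

The evidence does not make the claim certain; it just shifts the probability, from 0.3 to about 0.56.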

Craven assembled a team of experts in different subjects and had them make guesses about where the nuke could have landed, based on estimated conditions and their expertise.

They looked at different aspects of the problem, calculated different wind currents and speeds, looked at different scenarios with the bomb’s parachutes, and many other potential factors.

The detailed map of the sea bottom was brought out and divided into quadrants. The team members calculated the odds that each quadrant contained the bomb and marked them on the map.

If the search ships reached the quadrant with the best odds and found nothing, this result fed back into the analysis: the odds were recalculated, raising the odds for all the other quadrants.
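That recalculation after a fruitless search can be sketched like this; the quadrant names, priors, and detection probability are all hypothetical:

```python
# Sketch of the quadrant recalculation described above (made-up numbers).
# Searching the most likely quadrant and finding nothing lowers its
# probability and raises every other quadrant's probability.

def update_after_failed_search(probs, searched, p_detect):
    """probs: dict of quadrant -> prior probability the object is there.
    p_detect: chance the search would have found the object if present."""
    miss = 1.0 - probs[searched] * p_detect  # probability of "not found"
    new_probs = {}
    for q, p in probs.items():
        if q == searched:
            new_probs[q] = p * (1.0 - p_detect) / miss
        else:
            new_probs[q] = p / miss
    return new_probs

quadrants = {"A": 0.5, "B": 0.3, "C": 0.2}
updated = update_after_failed_search(quadrants, "A", p_detect=0.8)
# "A" drops from 0.50 to about 0.17; "B" and "C" rise accordingly.
```

Note that a failed search does not rule the quadrant out entirely: the searchers might simply have missed the object, which is why "A" keeps some residual probability.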

After a period of fruitless searches, the team decided to look again at the testimony of the local fisherman who had apparently seen a parachute land in the water near where he was fishing. They combined this with their latest highest-odds quadrants to pinpoint a hot location.

Sharon Bertsch McGrayne describes what this meant in her book “The Theory That Would Not Die”:

Soon Orts’s testimony formed the basis for a high-likelihood hypothesis: with one parachute deployed, the bomb had plunged into a steep, deep-water canyon filled with tailings from an old lead mine. Mooney drew a one-mile radius around Orts’s spot and named it Alpha I.

This was what they needed. Finally they found the bomb lying in a narrow crevasse, deep on the sea floor.

Based on the lessons learnt from this entire endeavor, a new method of searching for lost objects was formulated. It is called Bayesian search theory and consists of several steps:

1) Formulate different hypotheses based on the things you know about the object and the situation.

2) For each of these hypotheses, state a probability for where the object could be located.

3) Plot them on a map.

4) Go search in the spot with the highest probability.

5) If it is not found there, recalculate the probabilities based on this new state of affairs and go search in the spot with the highest probability after this latest recalculation. Continue working like this until the object is found.
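The five steps above can be sketched as a loop. This is a toy simulation, not any real search method: it assumes a three-quadrant grid with made-up priors and a hypothetical 80% chance of spotting the object when searching the right quadrant:

```python
import random

# Toy simulation of the five steps of Bayesian search theory.
def bayesian_search(probs, true_location, p_detect=0.8, seed=0, max_tries=50):
    rng = random.Random(seed)
    probs = dict(probs)
    for attempt in range(1, max_tries + 1):
        # Step 4: search the currently most probable quadrant.
        target = max(probs, key=probs.get)
        if target == true_location and rng.random() < p_detect:
            return target, attempt
        # Step 5: not found -- recalculate and renormalise the odds.
        miss = 1.0 - probs[target] * p_detect
        for q in probs:
            scale = (1.0 - p_detect) if q == target else 1.0
            probs[q] = probs[q] * scale / miss
    return None, max_tries

# The object is actually in the least likely quadrant, "C" --
# the loop still converges on it after a few failed searches.
found, tries = bayesian_search({"A": 0.5, "B": 0.3, "C": 0.2},
                               true_location="C")
```

Even when the initial guess is wrong, each failed search shifts probability toward the remaining quadrants, so the loop eventually concentrates on the true location.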

This type of search pattern has since been used in many search and rescue missions for missing submarines, boats, or airplanes.

Well, the point is that this type of thinking could help you out in your own life. Remember my article on critical thinking and the base rate fallacy?

These different cognitive biases can be avoided if you adopt Bayesian thinking.

This way of thinking helps you get rid of black and white explanations of the world and instead view things and explanations through the lens of probability.

You start off with one view of the world based on evidence, and if new evidence is introduced, the probability of your initial worldview changes.

John Horgan in his blog post on “Scientific American” boils down the essence of Bayes Theorem and how it impacts your views:

Bayes’ theorem is a method for calculating the validity of beliefs (hypotheses, claims, propositions) based on the best available evidence (observations, data, information). Here’s the most dumbed-down description: Initial belief plus new evidence = new and improved belief.

So your certainty in your beliefs is not steadfast, but instead fluid. You should be able to modify your opinion based on new evidence.

As Sharon Bertsch McGrayne writes:

We modify our opinions with objective information: Initial Beliefs + Recent Objective Data = A New and Improved Belief. Each time the system is recalculated, the posterior becomes the prior of the new iteration. It was an evolving system, with each bit of new information pushed closer and closer to certitude.
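McGrayne’s “the posterior becomes the prior” loop can be sketched in a few lines of Python. The numbers are illustrative: each new observation is assumed to be twice as likely if the belief is true:

```python
# "Each time the system is recalculated, the posterior becomes the prior."
# Illustrative numbers: each observation has probability 0.6 if the belief
# is true and 0.3 if it is false.
belief = 0.5  # initial belief
for _ in range(10):
    numerator = 0.6 * belief
    belief = numerator / (numerator + 0.3 * (1 - belief))
# After ten consistent observations, belief exceeds 0.99 --
# approaching certitude, but never quite reaching 1.0.
```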

This type of thinking can help you lessen the impact of confirmation bias and instead open up your views to new possibilities.

Another use of Bayes Theorem that I already wrote about in my article on survivorship bias is judging the likelihood of one hypothesis happening over another.
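One convenient way to compare two hypotheses is the odds form of Bayes Theorem: posterior odds equal prior odds times the likelihood ratio. A sketch with hypothetical numbers:

```python
# Odds form of Bayes Theorem, with purely illustrative numbers.
prior_odds = 0.2 / 0.8        # hypothesis A starts at 1-to-4 against
likelihood_ratio = 0.9 / 0.1  # the evidence is 9x more likely under A
posterior_odds = prior_odds * likelihood_ratio        # 2.25 in favour of A
posterior_prob = posterior_odds / (1 + posterior_odds)  # about 0.69
```

A single piece of strongly one-sided evidence can flip which hypothesis is the more likely, even when the prior odds ran the other way.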

The central premise, a first principle if you will, is that most things in this world are uncertain. A lot of the time you do not have perfect information; you don’t know everything, and you need to make inferences.

Bayes Theorem allows you to live in this type of reality and make sense of things.

It informs your decision-making in a world of uncertainty. As new information comes in, you need to reflect on how this new evidence changes your view on things and then make course corrections based on it.

This view is gaining ground and more people are including Bayesian thinking in their daily lives or work.

However it was not always so and historically this notion has been quite controversial, even among scientists. At the center of this controversy lies the way that science has traditionally been viewed.

Another quote from Bertsch McGrayne clarifies this prejudice against Bayesian thinking:

Bayes runs counter to the deeply held conviction that modern science requires objectivity and precision. Bayes is a measure of belief. And it says that we can learn even from missing and inadequate data, from approximations, and from ignorance.

Bayesian thinking is growing as people are starting to acknowledge the inherent imperfections in the way humans think and make decisions.

For a long time, the classical model of economics viewed humans as rational actors, perfect in their decision making based on enlightened self-interest.

Now we are starting to realize that this view is flawed, and instead the view of behavioral economics of humans as falling prey to cognitive biases is becoming more prevalent.

Bayesian thinking is also a good approximation of how we learn.

For Nate Silver in “The Signal and the Noise”, Bayes Theorem is more than a mathematical formula:

It is, rather, a statement—expressed both mathematically and philosophically—about how we learn about the universe: that we learn about it through approximation, getting closer and closer to the truth as we gather more evidence.

Bayes Theorem is an important tool to have for any rational and critical thinker.

The scientific formula is this:

P(A|B) = P(B|A) × P(A) / P(B)

where P(A|B) is the probability of hypothesis A given evidence B, P(B|A) is the probability of seeing that evidence if A is true, P(A) is the prior probability of A, and P(B) is the overall probability of the evidence.

Using this theorem you can get a better perspective on possible explanations and decide on the best courses of action. It gives you feedback as new evidence comes in, allowing you to adjust course.

This type of thinking is not perfect and shouldn’t be used in all situations. It particularly fails at times when there is no prior evidence to base the initial odds on. So one of the most important things you will need to learn when applying the theorem is when to use it and when not to.

Smart Chimp takes a bunch of cans and starts banging them against the table.

Dr. Brainiac: “Mr. Chimp, you see you are misbehaving again. Instead, you should be thinking about how thinking in probabilities could improve your life.”

Smart Chimp: “The probability that you are an idiot rises with every second.”

Dr. Brainiac: “Oh come on Mr. Chimp!”

Smart Chimp: “Wait, I have just recalculated the odds based on this new evidence. Yes, still rising!”

 

Read More:
How to be a critical thinker

Beware of Advice: what can you really learn from successful people?



Copyright Renaissance Man Journal 2017