Le’ Notes #20: Heuristics and biases

This post serves as an introduction to the heuristics and biases school (HB) and how it might be applied in assessing war decisions.

One of the features of the naturalistic decision-making (NDM) school is its belief that people can be trained to make better decisions by becoming experts and developing better mental models. The rationalist school, by contrast, assumes that people are similar to robots, i.e. they make sound, rational judgments based on the information available at the time.

However, the HB school would beg to differ. People are inherently biased in making decisions, mostly because they rely on intuition, which rests on a number of heuristics we developed as part of the evolutionary process.

One of the foundational texts of the HB school is Thinking, Fast and Slow by Daniel Kahneman. As such, I have used it extensively in assessing the decision-making process of several wars throughout my Study of War class.

The HB school can be thought of as a direct opponent of the NDM school, and the differences between the two are stark. First, the NDM school is highly optimistic about the human capacity to learn, whereas the HB school is quite pessimistic. This follows from the HB school’s assumption that so long as people rely on their heuristics, their decisions will almost certainly contain biases. This applies even to the most seasoned experts, whom the NDM school would credit with making the best judgments. Second, the NDM school believes in observing people in their ‘natural’ habitat, i.e. in conditions of high stress, uncertainty, and ill-defined objectives. The HB school, however, is most comfortable with a more “scientific approach”, in which most of its experiments are rigidly controlled. Of course, this raises some questions about the applicability of the HB school to assessing war decisions.

System 1 and System 2: fast and slow

Coming back to the HB school, as represented by Kahneman: he believes that there are two systems of thinking, System 1 (intuition) and System 2 (deliberative thinking).

We usually operate in System 1 thinking, which is automatic, involuntary, and swift. This system is often related to our “fight or flight” compulsion when confronted with a threat, which was a great help to our ancestors when they lived as hunter-gatherers. Surely, the slowest person to run from a bear would not live on to pass on their genes. System 1 helps us effortlessly navigate the world, quickly sensing changes in the environment and helping us form appropriate courses of action.

However, just because it’s quick doesn’t mean it’s always right. Oftentimes, accuracy is sacrificed in favour of speed. A quick glance at someone brandishing a gun will trigger our defensive mechanisms; the gun, however, might be a toy. A classic experiment that shows the pitfalls of System 1 thinking is the following problem:

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

Intuitively, the answer that first comes to mind is $0.10. But that answer is wrong: if the ball cost $0.10, the bat would cost $1.10 and the total would be $1.20. The correct answer is $0.05, which makes the bat $1.05 and the total $1.10. This classic problem shows that quick thinking is not always correct thinking.
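The algebra behind the correct answer can be checked in a few lines of Python (a minimal sketch of the arithmetic, nothing more):

```python
# Constraints: ball + bat = 1.10 and bat = ball + 1.00.
# Substituting: ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
print(f"total = ${ball + bat:.2f}")             # total = $1.10

# The intuitive answer fails the second constraint:
intuitive_total = 0.10 + (0.10 + 1.00)
print(f"intuitive total = ${intuitive_total:.2f}")  # intuitive total = $1.20
```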

That’s when System 2 kicks in. Unlike System 1, System 2 is deliberative and slow. System 2 happens when you start thinking, “Wait a second, that doesn’t sound right…” Usually, System 2 kicks in when a situation requires you to draw on more mental resources to solve the problem at hand. For example, you will stop to think before attempting a complex math problem. Unlike System 1, System 2 is less prone to the biases that heuristics produce.

Heuristics and biases

So, what are these heuristics that so often lead to biases? Kahneman’s book outlines three main ones: availability, representativeness, and anchoring. A heuristic is generally understood as a simple rule that people adhere to when making decisions. Think of them as “rules of thumb”: not always correct, but generally correct. Each heuristic spawns a long, long list of related biases. For now, let’s just focus on the heuristics themselves.

The availability heuristic is related to how easily events can be recalled from memory. For example, you might think that terrorists kill more people than cancer does, ergo, terrorists are more dangerous than cancer. Or that mainland Chinese are coming to Indonesia to steal jobs. That’s what I like to call an “overblown minority”. In fact, terrorists don’t kill that many people; cardiovascular disease kills far more. Yet a person dying from morbid obesity rarely makes the papers compared to the shooting at Charlie Hebdo.

Hans Rosling gave an impressive TED Talk illustrating our tendency to draw conclusions based on how heavily events are covered by the media. We tend to think things are worse than they are because that’s what the media reports on. That’s where the availability heuristic does not work in our favour.

The representativeness heuristic is the tendency to slot a new occurrence into a larger category of pre-existing occurrences. This is where racism comes from: you intuitively match attributes to a stereotype you already have in mind. For example, consider the following description:

Daniel is a man that lives in a poor part of the city. He loves eating fried chicken and usually plays basketball with his friends after school.

What came to your mind first? You might consider that a stereotypical description of a young black person in America. That just means you’re racist. Daniel might well be an Indonesian boy.

The anchoring heuristic is the tendency to make judgments based on the first piece of information one receives. This is prevalent in haggling between buyers and sellers: the buyer tends to stay close to the “anchor” price if the seller names it first. To illustrate how anchoring works in military planning, consider first the nature of conjunctive and disjunctive events. A conjunctive event requires the previous step to succeed before the present one can start. For example, when planning the attack on Incheon, MacArthur had to go through these events:

  1. Land at Incheon
  2. Take Seoul
  3. Advance across the 38th Parallel
  4. Defeat North Korea and let the eagles of freedom caw the national anthem over the Korean Peninsula
  5. Have a beer and hang out with bikini-clad bimbos

Anchoring occurs when the probability of success of the entire operation is hinged on the success of the initial event. With conjunctive events, the overall probability of success tends to be low, since ALL of the steps must succeed. But when we anchor our judgment on the likelihood of success in Step 1 (and Step 1 only), we tend to overestimate the probability of success. MacArthur did succeed in landing at Incheon, but from Step 3 onwards the odds of success declined.
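The compounding effect is easy to see numerically. Here is a minimal Python sketch; the per-step probabilities are invented purely for illustration and are not historical estimates:

```python
# Hypothetical per-step success probabilities (invented for illustration).
steps = {
    "Land at Incheon": 0.90,
    "Take Seoul": 0.85,
    "Advance across the 38th Parallel": 0.80,
    "Defeat North Korea": 0.70,
}

# A conjunctive operation succeeds only if EVERY step succeeds,
# so the joint probability is the product of the individual ones.
overall = 1.0
for p in steps.values():
    overall *= p

anchored = steps["Land at Incheon"]  # judging the whole plan by Step 1 alone
print(f"anchored estimate: {anchored:.2f}")      # anchored estimate: 0.90
print(f"joint probability: {overall:.2f}")       # joint probability: 0.43
```

Even when each step individually looks likely, the chain as a whole is closer to a coin flip, which is exactly what anchoring on Step 1 hides.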

Disjunctive events, on the other hand, invite us to underestimate the probability of failure when many components are involved in a system. A disjunctive event concerns small components whose individual failure may or may not significantly influence the outcome. In a war setting, these components may be logistics, communications, and ammunition. Though the probability of failure is small for each component, the risk of an overall failure increases as more and more variables are included. Yet we tend to anchor our judgment on the first thing we see, which leads to underestimation. For example, we might be tempted to think that since guns are unlikely to jam, there will be little satellite interference, and our armoured carriers will not run out of gas, the whole operation will be a success.
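The same arithmetic works in reverse for disjunctive events. A minimal sketch, with component failure rates invented purely for illustration:

```python
# Hypothetical component failure rates (invented for illustration).
failure_rates = {
    "gun jams": 0.02,
    "satellite interference": 0.03,
    "carrier runs out of fuel": 0.01,
    "radio failure": 0.04,
    "ammunition shortage": 0.05,
}

# Probability that every component works: the product of each (1 - failure rate).
p_all_ok = 1.0
for p_fail in failure_rates.values():
    p_all_ok *= 1.0 - p_fail

# The operation stumbles if ANY single component fails.
p_any_failure = 1.0 - p_all_ok
print(f"worst single component: {max(failure_rates.values()):.0%}")  # 5%
print(f"chance at least one fails: {p_any_failure:.0%}")             # 14%
```

Each rate looks negligible on its own, which is what anchoring invites us to judge by, yet together they give roughly a one-in-seven chance that something goes wrong.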

In other words, Clausewitz was right about friction.

The bells ring…

Here’s a quick recap on the heuristics and biases:

  • The HB school tries to deliver a rigorous, scientifically grounded method of explaining how people make decisions. Its central assumption is that the use of heuristics (or intuition) often leads to errors in judgment.
  • There are three heuristics that Kahneman observed to be most influential in shaping intuition: availability, representativeness, and anchoring.

