You may have nothing to do with mathematics. Or you may be an experienced statistician. It doesn’t matter when you make your everyday decisions: our brains just can’t deal with probabilities, even if we have profound knowledge of the underlying theorems. When we have to reason intuitively, we make elementary mistakes.
This discovery proved to be such a breakthrough in the social sciences that in 2002 it earned Daniel Kahneman, an Israeli-American psychologist, the Nobel Prize (precisely, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel). The jury awarded him, together with the economist Vernon L. Smith, “for having integrated insights from psychological research into economic science, especially concerning human judgement and decision-making under uncertainty”1. Reading Kahneman’s bestseller “Thinking, Fast and Slow” is the best excuse to procrastinate. You’ll be surprised by the tricks our brains play!
On 557 pages (Polish edition) there was quite a lot of space to reveal our biggest sins when it comes to assessing uncertainty. Let me summarise the most shocking ones, following the paper “Judgment under Uncertainty: Heuristics and Biases” (Tversky and Kahneman, 1974).
Ignoring the prior probability.
We often need to assess the likelihood that some object A belongs to a class B, or that an event A will result in an event B. We tend to forget Bayes’ theorem and completely ignore the prior distribution. If you’re not familiar with this notion, consider the following example.
Imagine that a friend tells you that her neighbour wears glasses, has a white beard, seems to have lost contact with reality and is hard to understand. Would you say that he works as a philosopher or as a farmer? Your intuition probably tells you that your friend’s neighbour is a stereotypical professor of philosophy. However, the question was about the probability of him working in a given profession. Thus we need to consider the proportion of philosophers to farmers in the UK before we think about his description.
Kahneman also described an experiment in which volunteers were given personality descriptions of a few people, each supposedly drawn from a group consisting of 30 engineers and 70 lawyers (in one case) or 70 engineers and 30 lawyers (in the other). Common sense tells us that we can’t fully trust such descriptions; a short sketch of a personality doesn’t determine the profession. Thus we would expect volunteers in the first group to label a given person a lawyer about five times more often than those in the second group, because the ratio of the two groups’ odds is (0.7/0.3)/(0.3/0.7) = (0.7/0.3)², so about 5.44. However, there were no significant differences between these groups: they completely ignored the prior probabilities. They didn’t make the same mistake when personality descriptions were not available. What’s even more interesting, when the description wasn’t informative (something like “Tom has a wife and two kids. He is hard-working and has many successes in his field”), they assessed the probability to be about 50%, regardless of the group.
Remember! When there’s no extra information, base your assessment on the prior probability!
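The arithmetic behind the engineers-and-lawyers experiment is easy to check. Below is a small sketch of the Bayes’ theorem calculation; the `posterior_lawyer` helper and the likelihood-ratio parameter are my own illustrative names, not anything from the paper.

```python
def posterior_lawyer(prior_lawyer, likelihood_ratio=1.0):
    """Posterior P(lawyer | description) via Bayes' theorem.

    likelihood_ratio = P(description | lawyer) / P(description | engineer).
    A ratio of 1.0 models a completely uninformative description.
    """
    prior_odds = prior_lawyer / (1 - prior_lawyer)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# With an uninformative description, the prior should decide the answer:
p_group1 = posterior_lawyer(0.70)  # told: 70 lawyers, 30 engineers
p_group2 = posterior_lawyer(0.30)  # told: 30 lawyers, 70 engineers
print(p_group1, p_group2)          # 0.7 vs 0.3 -- not 50/50!

# The "about five times more often" figure is the ratio of the two
# groups' prior odds: (0.7/0.3) / (0.3/0.7) = (0.7/0.3)^2
print(round((0.7 / 0.3) / (0.3 / 0.7), 2))  # 5.44
```

In other words, a description carrying no evidence should leave the 70/30 prior untouched; the volunteers instead collapsed both cases to 50%.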
The law of small numbers.
Imagine that we randomly measure the height of five men. What’s the average result? You might be tempted to respond with a number close to the average height of a British man. However, such a small sample can easily contain very short or very tall men – and the mean height will be far from 177 cm. According to Kahneman, surveyed people give the same answer for samples of size 10, 100 or 1000, forgetting that, as a rule of thumb, averaging works reliably only for large samples.
Now consider two hospitals: a small rural one, and a big one. For both institutions we count how many days during one year boys constituted over 60% of newborns. In your opinion, which hospital had more such days? We can assume that in general about 50% of newborns are boys. The more kids are born, the closer to that average we should get, so the extremes will happen more often in the smaller hospital. However, people tend to say that the probability of such an event is the same for both hospitals. Yes, the process is the same – but numbers matter.
Scientists commit this sin too. Some tend to overestimate the significance of their “discovery”. With a small enough sample, we can discover unbelievable things!
Remember! Small samples don’t need to be representative for the population!
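The hospital example is easy to simulate. The sketch below assumes each birth is a boy with probability 0.5, independently; the daily birth counts (15 vs 120) are made-up sizes for a rural and a big hospital.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def extreme_days(births_per_day, days=365, threshold=0.6):
    """Count days on which boys exceed `threshold` of newborns.

    Assumes each birth is a boy with probability 0.5, independently.
    """
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        if boys / births_per_day > threshold:
            count += 1
    return count

# Hypothetical sizes: 15 births/day (rural) vs 120 births/day (big city).
print(extreme_days(15))   # small hospital: dozens of "over 60% boys" days
print(extreme_days(120))  # big hospital: only a handful of such days
```

With 15 births a day, a 60%-boys day needs only a mild fluctuation (10 boys out of 15); with 120 births it needs 73, which is far out in the tail of the distribution.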
Misconception of randomness.
There’s an anecdote about a professor who gave their students homework: toss a coin 100 times and write down the results. Most of the students were too lazy to do that and simply wrote down sequences of H (heads) and T (tails) that looked “random” to them. The professor knew who had done the job: the lazy students avoided long runs of consecutive H or T. Nobody would write HTHHHHHHHHHTH. Why?
We tend to think that some sequences are more random than others – that after a long run of the same result, the randomness will “correct” itself later on. Well, it won’t. When we toss a fair coin, the probability of heads or tails always equals 0.5, independently of previous tosses. Yes, the ratio of heads to tails should be close to 1 – but only when we toss the coin infinitely many times. Good luck!
Remember! A random sequence doesn’t have to look “random”!
Misconception of regression.
Flight instructors noticed that punishment seemed to be the best training method. They said that after praise, trainees did worse in the next flight, while punishment improved their performance. So should we punish rather than praise?
Not really. Regression to the mean means that results tend to “come back” to the average. So when a pilot’s performance was better than average, he will probably do worse during the next flight, regardless of whether he was praised, punished or ignored. After a horrible flight, he will most likely do better next time.
Similarly, children of tall parents tend to be shorter than their parents, and children of short parents taller; otherwise, tall families would keep getting taller and short families shorter.
Misunderstanding this process can have severe implications. For example, if a company starts to perform worse after many successes, most of us blame the company’s president. However, it might be the case that regression is the culprit. Kahneman claims that many decent businessmen have probably been fired simply because the company’s results started to come back to the average.
Remember! Extreme results drift back to the average – the unusually rich get poorer and the unusually poor get richer!
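The flight-instructor story can be simulated directly. In the sketch below every pilot has the same fixed skill and each flight’s score is skill plus random luck – a deliberately simplified model of my own, not Kahneman’s data. Pilots “praised” after an unusually good flight regress on the next one with no punishment involved.

```python
import random

random.seed(1)  # fixed seed for a reproducible run

def flight_score(skill):
    """One flight's score: stable skill plus random luck (Gaussian noise)."""
    return skill + random.gauss(0, 10)

# A hypothetical squadron of 1000 pilots, all equally skilled.
pilots = [50.0] * 1000
first = [flight_score(s) for s in pilots]

# Select the pilots who flew unusually well the first time...
good = [i for i, score in enumerate(first) if score > 60]
second_after_good = [flight_score(pilots[i]) for i in good]

# ...and compare their averages: well above 50 on the first flight,
# back near 50 on the second, although nobody was punished or praised.
print(sum(first[i] for i in good) / len(good))
print(sum(second_after_good) / len(second_after_good))
```

The “decline” after a great flight is pure luck running out, which is exactly why the instructors’ praise looked harmful.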
I could list many, many more popular misconceptions about uncertainty. Most of them seem easy to avoid. They are – but only when we think carefully, which isn’t always possible when we have to make an intuitive decision. The best we can do is to be aware of our tendencies and to accept that we could be wrong. Just in case.