Thinking, Fast and Slow - A summary
In case you don't want to read all 500 pages...
23 Sep 2019

Background
I don’t always read, but when I do, I forget everything about the book within a matter of weeks. I finished Thinking, Fast and Slow about two weeks ago, so I figured it would be useful to summarize it for myself, and for anyone else not inclined to read this rather large, occasionally dry book.
Thinking, Fast and Slow is a 2011 book by Daniel Kahneman, a psychologist and winner of the Nobel Memorial Prize in Economic Sciences. It received many awards, and is a great way to learn about behavioral psychology. Here’s my brief review and summary.
My Thoughts
Kahneman’s book is very well-organized and easily comprehensible for the layperson. It covers a large number of concepts, detailed below, though I think he could have covered much more ground in the course of 500 pages. He backs up each concept he presents with an immense number of psychological studies. While many of the studies are fascinating, and reinforce the fact that the human mind is very fallible, the sheer number of examples can start to feel repetitive and, at times, excessive. Nonetheless, the simplicity with which he describes each experiment helped convince me that these psychological phenomena are “real”, and aren’t just the result of faulty research: each concept is very well-supported.

A key takeaway I got from the book is that, despite having researched the brain for decades, Kahneman himself is unable to control many of the irrational ways that he (and everyone else) thinks. Thus, the book should not be viewed as a self-help book, but rather as a way to perhaps make a few better decisions in your lifetime. The brain is not easily trained to avoid “fast”, irrational thinking.
The Main Idea
The Two Systems
The first part of the book is dedicated to describing the two systems that the human mind employs on a daily basis:
- System 1 (fast): The automatic, fast, generally unconscious way of thinking
- System 2 (slow): The effortful, slow, generally conscious way of thinking
Kahneman explains how, for example, an experienced driver behind the wheel is generally using System 1. The brain is doing a lot of work to stay in a lane, watch out for reckless drivers, and even listen to music at the same time, but all of this work is done automatically, without much effort. However, given a tricky situation on the road, such as a tough merge or a complex detour, System 2 can take over to do the deliberate work necessary to evaluate and handle the more difficult situation. In general, System 1 is the part of the brain that is subject to a plethora of biases, whereas System 2, when invoked, can make much more logical decisions. Of course, System 2 takes effort to use, so System 1 tends to be used far more often.
Biases/Heuristics
There are frankly too many biases in the book to cover here. However, almost all of them stem from the idea of WYSIATI (what you see is all there is). We love to come up with answers as quickly as possible, so it is much easier to make decisions based on what is in front of us. If what we see isn’t actually relevant to the decision, too bad! Our brain will use some heuristic to make a bad decision anyway. What you see is all there is when it comes to System 1 making immediate decisions. Here are a few more specific examples of biases/heuristics employed by System 1 that are explained in the book:
Substitution
The idea behind substitution is that, if we are asked a question, and we do not know the answer to the question itself, our System 1 will substitute an easier question for the question asked of us, and then “look up” the answer to the easier question. A great concept raised in the book is our inability to use “base rates” when assessing the likelihood of outcomes. Consider “Tom”, an undergraduate student described as “Extremely analytical. Loves sci-fi movies, computer games, and sometimes struggles socially.” What is Tom most likely to major in?
1. Computer Science
2. Psychology/Humanities
3. Law
4. Library Sciences
If you chose 1, so did I. However, your brain substituted the easier question of similarity for the harder question of probability. Of course, Tom sounds like a Computer Science major. However, if we consider that Psychology is a much more popular major than Computer Science, it’s still much more likely that Tom is a nerdy Psychology major. When making probabilistic judgments, we should always start from the “base rate”, and then adjust it based on the additional information we have. For example, if 10% of undergraduates study Psychology and 2% study Computer Science, then given what we know about Tom, maybe we can adjust those probabilities to 8% and 4%. But to think that the probabilities would completely reverse based on a two-sentence description of Tom strains credulity when we think slowly about probability, as the sketch below shows.
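To make the base-rate adjustment concrete, here is a minimal sketch of Bayes’ rule in Python. All of the numbers are assumptions for illustration (they are not figures from the book): the base rates are the share of undergraduates in each major, and `fit` is a made-up likelihood of Tom’s description given each major.

```python
# Hypothetical numbers, purely for illustration.
base_rate = {"Psychology": 0.10, "Computer Science": 0.02}
fit = {"Psychology": 0.30, "Computer Science": 0.90}  # assumed P(description | major)

# Bayes' rule, up to a normalizing constant: posterior is proportional
# to base_rate * fit.
unnormalized = {major: base_rate[major] * fit[major] for major in base_rate}
total = sum(unnormalized.values())
posterior = {major: p / total for major, p in unnormalized.items()}

print(posterior)
# roughly {'Psychology': 0.625, 'Computer Science': 0.375}
# Even though the description fits CS three times better, the five-fold
# base-rate advantage keeps Psychology the more probable major.
```

Under these made-up numbers, the description shifts the odds substantially toward Computer Science, but not nearly enough to overcome the base rate.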
Framing
Medical students were once asked whether they would choose a particular surgery: some were told that 10% of patients are killed by the operation, others that 90% survive. Of course, the ones given the mortality framing were less likely to choose the surgery. Another in-depth study looked at how likely people were to elect to donate their organs in the event of a motor vehicle accident. By far the biggest predictor of whether people elected to donate was the default option when signing up for a license: if asked to opt out of donation, people tended to donate; if asked to opt in, people tended not to.
Overconfidence
Not much to say here, but we are irrationally overconfident. When small business owners were asked how likely they were to succeed, they consistently quoted probabilities much higher than the actual rate at which small businesses succeed. Even when told the average rate of success, they still gave estimates well above it!
Prospect Theory
The most interesting part of the book, for me, was prospect theory. The governing idea of economics has been that rational agents maximize their expected utility. However, Kahneman and his longtime collaborator Amos Tversky set out to figure out how Humans, not Econs, actually make decisions, and found that humans do not simply maximize expected utility. The Wikipedia page explains it elegantly:
- Faced with a risky choice leading to gains, individuals are risk-averse, preferring solutions that lead to a lower expected utility but with a higher certainty (concave value function).
- Faced with a risky choice leading to losses, individuals are risk-seeking, preferring solutions that lead to a lower expected utility as long as it has the potential to avoid losses (convex value function).
In simple terms, which would you rather?
1. A 100% chance of winning $1000
2. A 90% chance of winning $1150
Prospect theory predicts you’ll take #1. Expected utility theory expects you to prefer #2, since its expected value is higher ($1035 vs. $1000).
How about here?
1. A 100% chance of losing $1000
2. A 90% chance of losing $1150
Prospect theory predicts #2. Expected utility theory expects you to prefer #1, since its expected loss is smaller ($1000 vs. $1035). The sketch below works through both cases.
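To see the numbers behind these predictions, here is a minimal sketch in Python. The square-root value function is an assumed stand-in chosen only because it is concave, not Kahneman and Tversky’s actual value function (which is also loss-averse and weights probabilities nonlinearly).

```python
import math

# Raw expected values: the gamble is worth more on average in both cases.
ev_sure_gain  = 1.00 * 1000   # $1000.00
ev_risky_gain = 0.90 * 1150   # $1035.00 -> expected utility theory says gamble

# A toy concave value function for gains (sqrt is an assumption, not the
# book's actual function): the sure thing now comes out ahead.
v_sure_gain  = math.sqrt(1000)          # ~31.6
v_risky_gain = 0.90 * math.sqrt(1150)   # ~30.5 -> take the certain $1000

# Mirroring the function for losses makes it convex there, so the gamble
# "hurts" less than the certain loss: risk-seeking over losses.
v_sure_loss  = -math.sqrt(1000)         # ~-31.6
v_risky_loss = -0.90 * math.sqrt(1150)  # ~-30.5 -> take the gamble
```

The point is just the shape: any value function that is concave over gains and convex over losses reproduces the pattern described in the two bullets above.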
Summary
In the end, I found the book an eye-opening look at just how irrational humans are. However, reading this blog post (or a better one) might quench any thirst you have to learn a little about behavioral psychology, without poring over hundreds of summaries of studies.