Book Summary: Thinking Fast and Slow by Daniel Kahneman
EVALUATION
Thinking, Fast and Slow is, above all, about how frequently we make poor decisions without even realizing it. As such, it is also a guide to identifying those areas and times when you are likely to make poor decisions. Nobel laureate Daniel Kahneman is the author, and his intelligence and experience are evident both from the breadth of academic studies cited and the authority with which he applies the evidence to a range of intriguing problems. This really feels like a very wise man trying to cram a lifetime of lessons into one book…and by and large succeeding!
The book requires more concentration than typical Levitt/Gladwell-style non-fiction, though it’s still in the same realm of academic-wants-to-write-a-bestseller. More studies are cited, and in greater detail, in Thinking, Fast and Slow; the academic muscle pulses more visibly beneath the surface. The data is better, the conclusions more plausible and the research more important…but the book is less compelling. It actually took me quite a long time to read the whole book. Sometimes I even found it draining, in a way that I never felt with the likes of, say, Outliers. I suspect that this is because the book is, with the exception of the author and various mentioned colleagues, without characters. That said, every time I picked it up and read a section I felt like I realized something fairly profound, or that Kahneman had drawn out and examined a poorly developed inkling I’d been quietly harboring. In this sense, the book is almost like a reference book – you can open it at any point (after reading the introduction) and it immediately offers you value. The closing chapters on their own, on happiness and the experience of life, are worth far more than the book price.
For anyone interested in the psychology of sales, marketing and investment, the exploration of how people make choices in part four contains highly relevant information. It is very striking how reframing problems in terms of loss and gain can produce totally different reactions in otherwise intelligent people. For businesses trying to weigh up possible risks, the ‘premortem’ technique in part three is incredibly helpful. For sportspeople puzzling over their success or lack thereof, the section on regression to the mean will be deeply illuminating. Now that I think about it, the widespread applicability of the content of this book is quite remarkable.
In summary, I really feel that you would struggle to read the book and not come away with something of value. Highly recommended.
SUMMARY
Part 1: Two Systems
- A key concept in the book is the characterization of two methods of thinking. All people employ both methods, which Kahneman names ‘system 1’ and ‘system 2’. System 1 is automatic – it determines if someone is happy when you look at them, it judges distance, and it forms first impressions. System 2 is slow and deliberate – it comes into play when someone asks you: ‘what is 17 x 58?’. The key point of the opening section is to highlight that often we think we are using system 2 when in fact we are using system 1.
- People have a tendency to ‘substitute’ easier questions when asked difficult ones. This is an unconscious reversion to system 1, when the question really demands system 2’s deliberation. For example, when asked ‘are you happy?’, people will often answer the question ‘how did the past few days make you feel?’ rather than the much more profound and difficult question. This is done automatically.
Part 2: Heuristics and Biases
The Law of Small Numbers
- We are natural pattern seekers. Consider the following sequences of births at a hospital:
- BBBGGG
- GGGGGG
- BGBGBG
- These are independent events, so each sequence is exactly as likely as the others – but “when we detect what appears to be a rule, we quickly reject the idea that the process is truly random […] we are far too willing to reject the belief that much of what we see in life is random” (pp.115-117)
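A quick simulation (my own sketch, not something from the book) makes the point concrete: if each birth is an independent 50/50 event, every specific six-birth sequence has probability (1/2)^6, roughly 1.6%, however patterned it looks.

```python
# Sketch: each exact six-birth sequence is equally likely under independent 50/50 births.
import random

SEQUENCES = ["BBBGGG", "GGGGGG", "BGBGBG"]
TRIALS = 1_000_000

counts = {s: 0 for s in SEQUENCES}
for _ in range(TRIALS):
    births = "".join(random.choice("BG") for _ in range(6))
    if births in counts:
        counts[births] += 1

# Each sequence should appear in roughly 1/64 (about 1.56%) of trials.
for s in SEQUENCES:
    print(s, counts[s] / TRIALS)
```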
“The exaggerated faith in small samples is only one example of a more general illusion – we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.” (p.118)
Anchors
- The anchoring effect occurs “when people consider a particular value for an unknown quantity before estimating that quantity” (p.119)
As you may have experienced when negotiating for the first time in a bazaar, the initial anchor has a powerful effect. My advice to students when I taught negotiations was that if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear – to yourself as well as to the other side – that you will not continue the negotiation with that number on the table. (p.126)
- “The main moral of priming research is that our thoughts and our behavior are influenced, much more than we know or want, by the environment of the moment” (p.128)
The Availability Heuristic
- People have a huge tendency to make inferences from samples that are too small. A sufficiently large sample size is critical.
- “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors” (p.130)
- e.g. “A salient event that attracts your attention will be easily retrieved from memory. Divorces among Hollywood celebrities and sex scandals among politicians attract much attention, and instances will come easily to mind. You are therefore likely to exaggerate the frequency of both Hollywood divorces and political sex scandals.” (p.130)
Affect Heuristic
- Study in which participants had to estimate which of two causes of death was more frequent:
“Strokes cause almost twice as many deaths as all accidents combined, but 80% of respondents judged accidental death to be more likely […] Death by accidents was judged to be more than 300 times more likely than death by diabetes, but the true ratio is 1:4” (p.139)
- The availability of information (typically through the media) distorts the decisions people make, e.g. people taking the train instead of flying after a high-profile plane crash. This is the availability bias. “The lesson is clear: estimates of causes of death are warped by media coverage” (p.139)
Representativeness
“One sin of representativeness is an excessive willingness to predict the occurrence of unlikely (low base-rate) events. Here is an example: you see a person reading The New York Times on the New York subway. Which of the following is a better bet about the reading stranger?
- She has a PhD.
- She does not have a college degree.
Representativeness would tell you to bet on the PhD, but this is not necessarily wise. You should seriously consider the second alternative, because many more nongraduates than PhDs ride in New York subways” (pp. 151-152)
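A back-of-the-envelope Bayes calculation shows why the base rate dominates here. The numbers below are my own illustrative assumptions, not figures from the book.

```python
# Sketch with made-up numbers: why "no college degree" can be the better bet
# even if PhDs are far more likely to read The New York Times.

# Assumed base rates among subway riders (hypothetical):
p_phd = 0.02          # 2% of riders hold a PhD
p_no_degree = 0.50    # 50% of riders have no college degree

# Assumed likelihood of reading the NYT in each group (hypothetical):
p_nyt_given_phd = 0.40
p_nyt_given_no_degree = 0.05

# Bayes' rule: P(group | reads NYT) is proportional to P(group) * P(reads NYT | group)
phd_weight = p_phd * p_nyt_given_phd                    # 0.008
no_degree_weight = p_no_degree * p_nyt_given_no_degree  # 0.025

print("PhD:", phd_weight, "No degree:", no_degree_weight)
# The PhD is 8x more likely to be reading the NYT, yet the much larger base
# rate of non-graduates still makes "no college degree" the better bet.
```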
Causes trump statistics
The experiments with the seizure victim and the electric shock victim show the following:
- “Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular” (p.174)
- “But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology because the incongruity must be resolved and embedded in a causal story.” (p.174)
Indeed, the statistician David Freedman used to say that if the topic of regression comes up in a criminal or civil trial, the side that must explain regression to the jury will lose the case. Why is it so hard? The main reason for the difficulty is a recurrent theme of this book: our mind is strongly biased toward causal explanations and does not deal well with ‘mere statistics’. (p.182)
- “Depressed children treated with an energy drink improve significantly over a three-month period […] Most readers of such headlines will automatically infer that the energy drink or the cat hugging caused an improvement, but this conclusion is completely unjustified. Depressed children are an extreme group, they are more depressed than most other children – and extreme groups regress to the mean over time.” (p.183)
- “Regression is also a problem for system 2. The very idea of regression to the mean is alien and difficult to communicate and comprehend.” (p.194)
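A small simulation (my sketch, not the book’s) shows the mechanism: if a measured score is a stable trait plus noise, the group that looks most extreme on the first measurement will, on average, look less extreme on the second, even with no intervention at all.

```python
# Sketch: extreme groups regress toward the mean on re-measurement.
import random

random.seed(0)
N = 100_000
trait = [random.gauss(0, 1) for _ in range(N)]    # stable underlying level
first = [t + random.gauss(0, 1) for t in trait]   # first measurement = trait + noise
second = [t + random.gauss(0, 1) for t in trait]  # second measurement, fresh noise

# Pick the 5% with the most extreme (lowest) first scores, e.g. "most depressed".
cutoff = sorted(first)[int(0.05 * N)]
extreme = [i for i in range(N) if first[i] <= cutoff]

mean_first = sum(first[i] for i in extreme) / len(extreme)
mean_second = sum(second[i] for i in extreme) / len(extreme)
print(mean_first, mean_second)  # the second mean sits noticeably closer to 0
```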
Part 3: Overconfidence
- Outcome bias results in poor evaluations of judgments (a stupid decision can still work out well due to factors beyond one’s control).
- The world is very uncertain. This is a massively under-appreciated fact. “declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true” (p.212). Kahneman refers to these instances as the *illusion of validity*. This illusion is often cultivated by corporate expectations: a CFO can’t admit that he has no idea what is going to happen next quarter – he’d be out the door! But often, that’s the hard truth.
- Here (p.212) Kahneman describes a study in which he showed that a professional stock-picking firm was rewarding luck as if it were skill.
- For these reasons, replacing human judgement with simple formulae overwhelmingly produces better results over large sample sizes (a toy example of such a formula appears after the premortem quote below).
- The concept of the *premortem* is fascinating and so I will quote it in full:
The procedure is simple: when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: ‘Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster’ […] The premortem has two main advantages: it overcomes the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction. (pp.264-265)
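Returning to the point about formulas beating judgment: the book credits this result largely to work by Paul Meehl and Robyn Dawes on simple, equally weighted linear rules. The sketch below is my own illustration of such a rule with hypothetical data, not an algorithm taken from the book.

```python
# Sketch: an equal-weight "improper linear model" - the unweighted sum of standardized predictors.
from statistics import mean, stdev

def rank_by_formula(candidates, predictors):
    """Rank candidates by the unweighted sum of their standardized predictor values."""
    stats = {p: (mean(c[p] for c in candidates), stdev(c[p] for c in candidates))
             for p in predictors}
    def score(c):
        return sum((c[p] - stats[p][0]) / stats[p][1] for p in predictors)
    return sorted(candidates, key=score, reverse=True)

# Hypothetical hiring data: two cues, weighted equally, no holistic "gut feel".
applicants = [
    {"name": "A", "test_score": 82, "work_sample": 7},
    {"name": "B", "test_score": 74, "work_sample": 9},
    {"name": "C", "test_score": 90, "work_sample": 4},
]
for c in rank_by_formula(applicants, ["test_score", "work_sample"]):
    print(c["name"])
```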
Part 4: Choices
- Reference points are critical to evaluations – in particular to the concept of utility. E.g. a $20,000 bonus today doesn’t have the same utility as it did three years ago, when you were earning less money.
- People are generally loss averse when it comes to taking risks – typically weighing losses about twice as heavily as equivalent gains – yet they become more willing to take risks when all their options are bad. This means that those who stand to lose will typically fight harder than those who stand to gain (a toy value function after this list illustrates the asymmetry).
- The endowment effect occurs when a person already owns something and is asked to part with it – the price they require to give it up is typically greater than what they originally would have paid for it. A good example of this is people with tickets to a concert who are asked to sell them. This is again linked to reference points and loss aversion – it involves the pain of giving something up.
- People generally overweight small probabilities, resulting in a willingness to pay disproportionately large amounts for certainty (for example in court settlements); the weighting function in the sketch after this list captures this.
- People frequently overestimate the likelihood of rare and unlikely events (e.g. natural disasters).
- Manipulating emotions is easy by switching viewpoints (e.g. in medical treatment, lives saved vs. probability of death). This is known as reframing and has a huge number of applications.
- Loss aversion across an organization can lead to the organization as a whole failing to take enough risk. This tendency is typically reinforced by judging decisions on their outcomes rather than on the quality of the decision given the information available at the time.
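A compact way to see the asymmetry referred to above is a prospect-theory-style value function and probability-weighting function. The functional forms and parameter values below are the commonly cited Tversky–Kahneman estimates, used here purely as an illustrative sketch; the “about twice as heavily” claim above corresponds to the loss-aversion coefficient λ.

```python
# Sketch: prospect-theory-style value and probability-weighting functions.
# Parameter values are the commonly cited Tversky-Kahneman (1992) estimates.

ALPHA = 0.88   # diminishing sensitivity to larger gains/losses
LAMBDA = 2.25  # loss aversion: losses weigh roughly twice as much as gains
GAMMA = 0.61   # curvature of the probability-weighting function

def value(x):
    """Subjective value of a gain or loss x relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

def weight(p):
    """Decision weight attached to probability p; small probabilities are overweighted."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

# A 50/50 gamble to win or lose $100 has expected money value 0,
# but its psychological value is negative, so most people decline it.
print(weight(0.5) * value(100) + weight(0.5) * value(-100))

# A 1% chance feels several times larger than it is, which is part of why
# people pay disproportionately to eliminate small risks entirely.
print(weight(0.01))   # roughly 0.055 rather than 0.01
```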
Part 5: Two Selves
- “Confusing experience with memory of it is a compelling cognitive illusion – and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.” (p.381) [e.g. the man whose scratched music CD ruined the ending of an otherwise great 40 minutes]
- “The memory that the remembering self keeps […] is a representative moment, strongly influenced by the peak and the end.” (p.383) – this is the ‘peak-end rule’; a toy calculation after this list applies it to the scratched-CD example.
- “The evidence presents a profound challenge to the idea that humans have consistent preferences and know how to maximize them, a cornerstone of the rational-agent model.” (p.385)
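As a toy illustration of the peak-end idea (my own sketch, not data from the book): if the remembering self’s verdict is scored as roughly the average of the worst moment and the final moment, the scratched-CD example falls out immediately.

```python
# Sketch: the peak-end rule applied to the scratched-CD example, using an
# "annoyance" scale (0 = wonderful, 10 = awful) for each minute of listening.
from statistics import mean

def remembered_score(per_minute):
    """Peak-end rule: memory is driven by the worst moment and the final moment."""
    return mean([max(per_minute), per_minute[-1]])

listening = [1] * 40 + [9]   # 40 great minutes, then a ruined final minute

print(mean(listening))            # experienced average: about 1.2 (mostly a great time)
print(remembered_score(listening))  # remembered verdict: 9 ("the ending ruined it")
```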
Life as a Story
- “The goals that people set for themselves are so important to what they do and how they feel about it that an exclusive focus on experienced well-being is not tenable. We cannot hold a concept of well-being that ignores what people want” (p.402)
- This phenomenon is the *focusing illusion*: “Nothing in life is as important as you think it is when you are thinking about it” (p.402)
- “Adaptation to a new situation, whether good or bad, consists in large part of thinking less and less about it” (p.405) – This is why you should think about the things for which you are grateful.
- “It appears that the remembering self is subject to a massive focusing illusion about the life that the experiencing self endures quite comfortably” (p.406). Profound stuff.