Thinking, Fast and Slow by Daniel Kahneman: Summary and Notes

“Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.”

Rating: 8/10

Related Books: How To Decide, Thinking in Bets, Judgment Under Uncertainty, Choices, Values, and Frames, Prospect Theory, Heuristics and Biases

Get all my book summaries here

 

Thinking, Fast and Slow: Short Summary

Thinking, Fast and Slow by Daniel Kahneman is one of the most detailed books on decision making. Kahneman covers each of our cognitive biases in great detail and even shares decision-making insights from his Nobel Prize-winning theory — Prospect Theory. A very informative read with the potential to transform your life for good.

Part 1: The Two Systems

The human brain is composed of two systems: System 1 and System 2.

System 1 comprises the oldest parts of the brain. It operates automatically and involuntarily, is always running, and handles most of our day-to-day activity. It also drives our reactions to danger and novelty and is the source of our intuitions.

System 2 allocates attention and handles tasks that require mental effort. It is associated with the more recently evolved prefrontal cortex, which is far more developed in humans than in any other species.

Chip & Dan Heath call the two systems the Elephant and the Rider in the book Switch.

The two systems help each other in decision-making. When System 2 is overwhelmed, System 1 takes over.

“Whenever you are conscious, and perhaps even when you are not, multiple computations are going on in your brain, which maintain and update current answers to some key questions: Is anything new going on? Is there a threat? Are things going well? Should my attention be redirected? Is more effort needed for this task? You can think of a cockpit, with a set of dials that indicate the current values of each of these essential variables. The assessments are carried out automatically by System 1, and one of their functions is to determine whether extra effort is required from System 2.”

Characteristics of System 1

  • It generates impressions, feelings, and inclinations; when these are endorsed by System 2, they become beliefs, attitudes, and intentions
  • Has no sense of voluntary control as it operates quickly and with little or no effort
  • It can be programmed by System 2 to mobilize attention when a particular pattern is recognized
  • Executes skilled responses after training
  • Links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
  • Differentiates the surprising from the normal
  • Infers and invents causes and intentions
  • Neglects ambiguity and suppresses any feelings of doubt
  • Is biased towards believing and confirming
  • Exaggerates emotional consistency (the halo effect)
  • Focuses on existing evidence and ignores absent evidence (what you see is all there is)
  • Generates a limited set of basic assessments
  • It does not integrate sets but rather represents them by norms and prototypes
  • Matches intensities across scales, e.g., comparing size to loudness
  • Computes more than intended (the mental shotgun)
  • Sometimes substitutes an easier question for a difficult one
  • Is more sensitive to changes than to states
  • Demonstrates diminishing sensitivity to quantity
  • Responds more strongly to losses than gains
  • Frames decision problems narrowly in isolation from one another

Part 2: Heuristics and Biases

The Law of Small Numbers

The law of small numbers is the misguided belief that small samples behave like large ones, i.e., that conclusions drawn from a handful of observations are as reliable as those drawn from a large sample.

For example:

If a survey of 300 older adults shows that 65% are likely to vote for a particular candidate, there is a temptation to conclude that elderly citizens in general favor that candidate. But a sample of 300 is small, and its result can differ substantially from the true figure for the whole population.
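
To see how much small samples swing, here is a minimal simulation sketch (not from the book); the 65% support rate and the sample sizes are assumptions chosen purely for illustration:

```python
import random

random.seed(0)

TRUE_SUPPORT = 0.65  # assumed true support rate in the whole elderly population

def poll(sample_size, n_polls=1_000):
    """Simulate repeated polls and report how widely the observed rate varies."""
    rates = []
    for _ in range(n_polls):
        votes = sum(random.random() < TRUE_SUPPORT for _ in range(sample_size))
        rates.append(votes / sample_size)
    return min(rates), max(rates)

for n in (30, 300, 3000):
    lo, hi = poll(n)
    print(f"sample of {n:>4}: observed support ranged from {lo:.0%} to {hi:.0%}")
```

The smaller the sample, the wider the spread of results around the true 65%, which is exactly the variability our pattern-seeking minds tend to ignore.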

“We are likely to make statistical mistakes because we are pattern seekers. The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify. Jumping to conclusions is a safer sport in the world of our imagination than it is in reality.”

Anchors

Anchors are arbitrary values that people consider for an unknown quantity before encountering that quantity.

Anchors are known to influence many things, including the amount of money people are willing to pay for products they have not seen.

For example:

Asking whether Gandhi was more or less than 144 years old when he died makes respondents more likely to estimate that he died at an advanced age.

System 2 is susceptible to anchors as well, and it has no knowledge of their influence to begin with.

The Science of Availability

The availability bias happens when we judge the frequency of an event by the ease with which examples come to mind, which leads us to give too much weight to recent or vivid evidence.

Salient events that attract your attention are likely to be retrieved from memory.

For example:

Divorces among Hollywood celebrities are likely to attract more attention, making the instances more likely to come to mind. As a result, you are likely to exaggerate the frequency of Hollywood divorces.

“A dramatic event temporarily increases the availability of its category. A plane crash that attracts media coverage will temporarily alter your feelings about the safety of flying. Accidents are on your mind, for a while, after you see a car burning at the side of the road, and the world is for a while a more dangerous place.”

Representativeness

We often rely on stereotypes to help us judge probabilities. 

For example:

When we see someone reading a copy of the New York Times on the subway, we are more likely to assume that they have a Ph.D. than that they have no college degree at all.

Representativeness helps us make quick decisions when we do not have all the facts. The downside is that it leads us to neglect base rates and can reinforce inaccurate stereotypes.
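
One way to see why neglecting base rates misleads us is a quick Bayesian back-of-the-envelope check. All the numbers below are hypothetical assumptions, not data from the book; the point is only that a small base rate can outweigh a strong stereotype:

```python
# All figures are hypothetical assumptions for illustration, not data from the book.
base_rate_phd = 0.02        # assumed share of subway riders with a Ph.D.
base_rate_no_degree = 0.50  # assumed share with no college degree

p_times_given_phd = 0.30        # assumed chance a Ph.D. holder reads the Times
p_times_given_no_degree = 0.05  # assumed chance for someone with no degree

# Unnormalised posterior weights: P(group) * P(reads the Times | group)
phd_weight = base_rate_phd * p_times_given_phd                    # 0.006
no_degree_weight = base_rate_no_degree * p_times_given_no_degree  # 0.025

print(f"Ph.D. weight:     {phd_weight:.3f}")
print(f"No-degree weight: {no_degree_weight:.3f}")
# Despite the stereotype, 'no degree' is roughly four times more likely here,
# because the base rate dominates how representative the description feels.
```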

Less Is More

When it comes to judgments of likelihood, adding detail about the subject can make a conclusion feel more plausible even as it becomes less probable.

For example:

Take the following two descriptions of a fictional lady called Linda:

  • Linda is a bank teller
  • Linda is a bank teller and is active in the feminist movement

In this case, the second description feels more representative of Linda, so most people judge it to be more likely. It cannot be: the probability that Linda is both a bank teller and a feminist activist can never exceed the probability that she is a bank teller, because a conjunction of two events is at most as probable as either event on its own.
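
The arithmetic behind the conjunction rule is easy to check. The probabilities below are invented for illustration; whatever values you pick, the conjunction can never come out higher than the single event:

```python
# Illustrative probabilities only (assumptions, not figures from the book).
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.40  # P(feminist activist | bank teller)

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_bank_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")

# The conjunction rule: P(A and B) <= P(A) for any events A and B.
assert p_teller_and_feminist <= p_bank_teller
```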

Causes Trump Statistics

When specific information about a case is available, base rates are generally underweighted and often neglected.

People are poor at statistical reasoning, and pointing out their errors rarely improves matters.

“The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned a new fact. There is a deep gap between our thinking about statistics and our thinking about individual cases. Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience.”

Regression to the Mean

Regression to the mean: over time, extreme outcomes tend to be followed by outcomes closer to the average.

Although regression to the mean is a widespread phenomenon, we come up with many causal reasons to explain it away.

For example:

Companies that outperform the market in one period rarely keep doing so. When their luck runs out, their performance drifts back towards the norm, i.e., it regresses to the mean.
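
A minimal simulation sketch makes the effect visible. It assumes (purely for illustration) that each company's yearly performance is a fixed skill component plus fresh luck; the firms that top the league in year one drop back toward the average in year two even though nothing about them changed:

```python
import random

random.seed(1)

N = 1_000  # hypothetical number of companies

# Assume performance = constant skill + fresh luck drawn each year.
skill = [random.gauss(0, 1) for _ in range(N)]
year1 = [s + random.gauss(0, 1) for s in skill]
year2 = [s + random.gauss(0, 1) for s in skill]

def avg(xs):
    return sum(xs) / len(xs)

# Select the top 10% performers of year 1 and follow them into year 2.
top = sorted(range(N), key=lambda i: year1[i], reverse=True)[: N // 10]

print(f"top decile, year 1: {avg([year1[i] for i in top]):.2f}")
print(f"same firms, year 2: {avg([year2[i] for i in top]):.2f}")
# The year-2 average falls back toward the mean because the luck that helped
# select these firms does not repeat.
```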

Part 3: Overconfidence

The Illusion of Understanding

“The ultimate test of an explanation is whether it would have made the event predictable in advance.”

The core of the illusion of understanding is that we believe that we understand the past, which implies that the future should also be knowable.

“Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events.”

The Illusion of Validity

“System 1 is designed to jump to conclusions from little evidence—and it is not designed to know the size of its jumps.” 

The illusion of validity happens when experts place too much faith in their own judgments. It affects everyone, including professional stock pickers who, on average, fail to outperform the market despite extensive training and experience.

Simple algorithms often predict better than human experts, largely because experts add inconsistent, unnecessary complexity to their judgments.
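
As a hypothetical sketch of what such a simple algorithm can look like, here is an equal-weight checklist score in the spirit of the simple formulas Kahneman describes; the features, thresholds, and use case are all invented for illustration:

```python
# Hypothetical equal-weight scoring rule; features and thresholds are invented.
def risk_score(applicant: dict) -> int:
    """Count how many simple risk flags apply; a higher count means higher risk."""
    flags = [
        applicant["missed_payments"] > 0,
        applicant["debt_to_income"] > 0.4,
        applicant["years_employed"] < 2,
        applicant["prior_defaults"] > 0,
    ]
    return sum(flags)

example = {"missed_payments": 1, "debt_to_income": 0.5,
           "years_employed": 5, "prior_defaults": 0}
print(risk_score(example))  # 2 of 4 flags raised
```

The appeal of such rules is consistency: they apply the same few cues the same way every time, which is precisely where expert judgment tends to wobble.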

The planning fallacy: making plans and forecasts based on a best-case scenario rather than on the outcomes of similar past projects.

Part 4: Choices

Prospect Theory

People evaluate gains and losses relative to a reference point, typically their earlier state, rather than in terms of final wealth alone. This runs counter to Bernoulli’s theory, in which the state of wealth is all you need to know to determine its utility.

Cognitive features at the heart of prospect theory:

  • Evaluation is relative to a neutral reference point. Outcomes above the reference point are experienced as gains, while those below it are experienced as losses
  • The principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes in wealth. The subjective difference between $900 and $1,000 is much smaller than the difference between $100 and $200
  • The third principle is loss aversion: losses loom larger than gains (see the value-function sketch below)
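
These three features are often summarised with the prospect-theory value function, v(x) = x^α for gains and v(x) = -λ(-x)^β for losses. The sketch below uses the commonly cited parameter estimates (α ≈ β ≈ 0.88, λ ≈ 2.25), which come from Kahneman and Tversky’s later work rather than from this summary:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a gain or loss x relative to the reference point (0)."""
    if x >= 0:
        return x ** alpha           # diminishing sensitivity to gains
    return -lam * ((-x) ** beta)    # losses loom larger (loss aversion)

for x in (100, 200, 900, 1000, -100):
    print(f"v({x:>5}) = {value(x):8.1f}")

# v(1000) - v(900) is much smaller than v(200) - v(100): diminishing sensitivity.
# |v(-100)| is roughly 2.25x v(100): a loss hurts more than an equal gain pleases.
```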

The Fourfold Pattern

[Figure: the fourfold pattern of preferences, from Thinking, Fast and Slow]

The Fourfold Pattern of Preferences is a framework that helps us understand how we evaluate prospective gains and losses. It has two mental effects at play: the Certainty Effect and the Possibility Effect.

Certainty Effect:

  • Quadrant 1 (high probability, big gains): People are willing to accept less than the expected value of a gamble to lock in a sure gain. For example, if there is a 95% chance of winning $1,000,000 in a lawsuit, most people will opt for an out-of-court settlement somewhat below the expected value rather than risk losing the case (see the worked example at the end of this section)
  • Quadrant 4 (low probability, big loss): People will pay a premium for the certainty of not losing, which is why insurance sells

Possibility Effect:

  • Quadrant 3: (Low probability, big gains): People over-invest for a minuscule chance at winning the lottery
  • Quadrant 2 (high probability, big losses): People will make desperate gambles in the hope of avoiding a near-certain loss. For example, people will spend everything they have on long-shot treatments for a terminal illness
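
A short expected-value check of the Quadrant 1 lawsuit example ties the pattern together. The settlement figure below is an assumption; the point is that accepting a sure amount below the gamble’s expected value is exactly what the certainty effect predicts:

```python
# Quadrant 1: a 95% chance of winning $1,000,000, a 5% chance of nothing.
p_win, prize = 0.95, 1_000_000

expected_value = p_win * prize  # $950,000 on average from going to trial
settlement = 900_000            # hypothetical sure out-of-court offer

print(f"expected value of going to trial: ${expected_value:,.0f}")
print(f"certain settlement accepted:      ${settlement:,.0f}")
# Taking a sure $900,000 over a gamble worth $950,000 on average is the
# certainty effect at work: the sure outcome is overweighted relative to its odds.
```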