This microbook is a summary/original review based on the book: Thinking, Fast and Slow
If you’ve ever wondered how our minds work, “Thinking, Fast and Slow” will provide the answers. Written by the eminent psychologist Daniel Kahneman, who won the 2002 Nobel Prize in Economic Sciences for his research, this compelling work summarizes a lifetime of study. In it, Kahneman attempts to give us a language for talking about the workings of our mind and offers compelling insights into why so much of our thinking is shaped by bias. So, get ready to delve into the wondrous workings of our minds!
To simplify the workings of the mind, we can look at two fictitious systems: System 1 and System 2. These systems influence one another and determine our thinking. System 1 is an automatic way of thinking. It provides us with immediate reactions to and impressions of the world around us. It’s quick and requires no conscious effort.
System 2, on the other hand, is our conscious thinking. It’s the system we use when trying to solve complicated tasks. It’s who we think we are, and it also articulates judgements and makes choices. We can control and modify System 1 through System 2 thinking, at least to an extent, by changing our focus, for example.
System 2 is not always a beacon of rationality. Just because we make a conscious effort to think about something doesn’t necessarily mean we’re right. Both systems are always working, but System 1 almost always guides our thoughts and actions, usually quite accurately. At the core of System 1 lies associative memory. It’s important to be aware of the subconscious bias that can develop from favoring fast thinking: we believe the story we make up. This leads to overconfidence and to greatly overrating our grasp of a situation.
System 2 usually operates in a low-effort mode. It functions according to the law of least effort – it’s lazy. Most of the time, the thinking processes don’t take up much energy or brain capacity, and that’s why System 1 usually has the upper hand.
At first, when solving a problem or answering a question, System 1 will attempt to supply the correct answer. Only if that system fails will System 2 be engaged. Intuitive answers come to mind confidently – meaning that we don’t realize when they’re off the mark. This fallacy of System 1 is easily demonstrated by looking at optical illusions.
Consider the bat-and-ball problem as another example. Suppose a bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?
Most likely, your answer was 10 cents. Intuitive, quick – and incorrect! Consider again. If the ball cost 10 cents, that would mean that the bat cost $1.10, making the total price $1.20. So, the correct answer is that the ball actually costs five cents. This shows how easily System 1 thinking can mislead us.
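The algebra behind the correct answer can be made explicit with a quick sketch (plain arithmetic, not from the book):

```python
# Bat-and-ball: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting: (ball + 1.00) + ball = 1.10, so 2 * ball = 0.10.
total = 1.10
difference = 1.00
ball = (total - difference) / 2   # five cents, not ten
bat = ball + difference
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${bat + ball:.2f}")
# prints: ball = $0.05, bat = $1.05, total = $1.10
```

The intuitive answer of 10 cents ignores that the one-dollar difference must be split out of the total before dividing.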
We can differentiate between two states of the mind – cognitive ease and cognitive strain. We experience the former when we read something printed in clear font, when we are in a good mood, or when something has been repeated to us. We experience cognitive strain when we’re in a bad mood, when something’s printed in faint colors or a complicated font, or when it is poorly worded.
This means that we often judge something as true or false based on whether it seems familiar to us; basically, on whether we can process it with cognitive ease.
The fact that our brains favor System 1 thinking can have serious consequences in real life. When we’re in System 1 mode, we’ll easily believe anything we’re told. That’s why repeating simplistic messages (political slogans, for example) often has such an effect on us that we come to believe them. Interestingly, this could explain why extremist political groups still seem to have so much influence on people.
One effect of favoring System 1 thinking is that we judge people based on our first impressions of them, even when we subsequently acquire evidence to the contrary. This is called the halo effect: we either like or dislike everything about a person immediately, without really knowing anything about them.
This means that when you first meet another person, and perhaps their looks or voice are pleasing to you, you are more likely to think favorably of them, even though you know little about them. This creates a subconscious bias in the way we engage with other people. And while the order in which we perceive a person’s character traits often happens by chance, we value our first impressions much more highly than any subsequent revelations.
A similar effect can be observed in priming. How would you complete the following word: SO_P? Your answer would probably be SOUP or SOAP depending on whether you had just heard the word EAT or WASH. This priming effect means that we retrieve words connected to other words with greater ease. The same goes for concepts and ideas. Our actions are often primed by things we may not even be aware of. For example, when we’re smiling, we find things funnier. And studies have shown that people who vote in a school, rather than in a different kind of building, are more likely to favor education issues when deciding whom to vote for.
This also means that living in a society focused on money primes people towards individualistic behavior. So the culture in which we live has a significant, if unnoticed, effect on the way we conduct our lives.
To explain why we value monetary gains and losses differently depending on the circumstances, Kahneman developed prospect theory. It introduced a reference point into the accepted theories of economic thinking. Basically, we dislike losing more than we like winning, so we base many of our decisions on that sentiment, whether or not they are economically sound.
Another way in which System 1 thinking influences our lives is through heuristics. This means substituting a difficult question with an easier one, and it happens automatically. It’s a shortcut for our brains that requires less energy when faced with a difficult question and prevents the lazy System 2 from having to work. Consider this question: “How should financial advisers who prey on the elderly be punished?”
Most likely, to answer this question, your brain has automatically simplified it and asked instead: “How much anger do I feel when I think of financial predators?” To then give the answer, System 1 uses intensity matching – the intensity of your anger vs. the intensity of the punishment you find appropriate.
This way of thinking can obviously be quite dangerous, as it prevents us from looking at a question or problem neutrally. And because an answer comes so readily to mind, we often fail to recognize complex problems as such. One example of this danger is called the affect heuristic.
Coined by psychologist Paul Slovic, the term “affect heuristic” describes how we let our likes and dislikes determine our beliefs about the world. We are more likely to believe statements and studies that match our existing beliefs. Our emotions therefore have a strong influence on the way we perceive political topics, and everything else in our lives.
Our heuristic way of thinking leads to systematic errors. When we want to determine the frequency of an event, for example, our brain automatically comes up with similar events that we may have read or heard about before. We then predict the likelihood of the event based on how easily we can recall a similar one. This is called the availability heuristic. So, media coverage of a plane crash can make you more scared of flying for a while. You will usually put more faith in your own experiences than in something you hear from someone else or read in the paper.
The availability heuristic already shows that the way our mind works hinders us from correctly predicting the frequency of an event. Similarly, many of us struggle to interpret statistics correctly, and this too can be explained by the workings of our minds.
A common mistake when looking at statistics is ignoring the base rate. Instead, our minds always attempt to create a causal story, which leads to the creation of stereotypes. For example, the base rate could tell you that most students enrolled at a university study humanities. Asked whether a random student studies humanities, you should determine your answer by that base rate. But if you are given additional information about the student that does not match your stereotype of a humanities student, you will ignore the base rate and jump to a causal conclusion.
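A small sketch shows why ignoring the base rate is a mistake. The numbers here are purely illustrative assumptions, not from the book: suppose 75% of students study humanities, and some described trait is twice as common among the other students.

```python
# Hypothetical numbers: 75% of students study humanities (the base rate),
# and the observed trait is twice as likely among non-humanities students.
base_rate = 0.75
p_trait_given_hum = 0.10
p_trait_given_other = 0.20

# Bayes' rule: weigh the evidence against the base rate instead of
# discarding the base rate entirely.
p_trait = (p_trait_given_hum * base_rate
           + p_trait_given_other * (1 - base_rate))
p_hum_given_trait = p_trait_given_hum * base_rate / p_trait
print(f"P(humanities | trait) = {p_hum_given_trait:.2f}")
# prints: P(humanities | trait) = 0.60
```

Even with evidence pointing away from the stereotype, the high base rate keeps “humanities” the more likely answer, which is exactly what the intuitive causal story overlooks.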
Another factor that often skews our interpretation of statistics is denominator neglect. When reading the statistic “a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability,” you will assess the risk as small. The same information can have quite the opposite effect when formulated differently: “One out of every 100,000 vaccinated children will be permanently disabled.”
This immediately brings up the image of a permanently disabled child and makes the risk appear a lot higher. “Low probability events are much more heavily weighted when described in terms of relative frequencies (how many) than when stated in more abstract terms of ‘chances,’ ‘risk,’ or ‘probability’ (how likely),” Kahneman points out.
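That the two statements describe the same risk can be verified with a quick conversion (plain arithmetic, not from the book):

```python
# The same risk stated two ways: as a percentage and as a frequency.
risk_percent = 0.001                  # "0.001% risk of permanent disability"
probability = risk_percent / 100      # percent -> probability: 0.00001
children = 100_000
expected_cases = probability * children
print(f"{risk_percent}% of {children:,} children = "
      f"{expected_cases:.0f} affected child")
# prints: 0.001% of 100,000 children = 1 affected child
```

Identical numbers, but the frequency framing conjures a concrete image of one child, which is why it feels so much riskier.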
The way our minds work also has a considerable effect on the way our memory works. In the 1990s, an experiment was conducted with patients who needed a colonoscopy, an extremely painful procedure. Patients were asked to rate their feelings of pain during the experience, and then were asked again after the colonoscopy how painful the treatment had been.
Patient A had a short procedure with a painful end, while Patient B had a longer procedure with a less painful end. Patient A and B both indicated similar pain levels when asked to rate them throughout the procedure. But once it had ended and they were asked to rate their overall pain levels, contrary to expectation, Patient A said he’d had the worst of it, even though the procedure had been much shorter.
This phenomenon is explained by the peak-end rule: we judge an experience largely by its most intense moment and by how it ends. Similarly, duration neglect means that the length of an experience does not significantly alter how we rate it.
The reason for this is that we have two selves when dealing with experience: the experiencing self and the remembering self. Our experiencing self is present in the moment; our remembering self evaluates experiences in retrospect. This means we often remember events quite differently from how they actually happened, since we consider the overall experience. It also shows how making decisions based on our past experiences can be misleading, since our memory is distorted.
The way we see and experience the world is often highly distorted. Because our brains leave much of their thinking to the automatic System 1, they often present us with conclusions that are off the mark. This can be dangerous, since it favors the creation of stereotypes and biases and can prevent us from making informed decisions.
Kahneman’s “Thinking, Fast and Slow” is a landmark publication, revolutionizing the way we think about ourselves and our decision-making. It is highly readable and very engaging.
Next time you meet someone new, try to be aware of your subconscious bias towards them by carefully analyzing your automatic reactions.
Daniel Kahneman is a behavioral finance theorist who explains human behavior in risk situations from the perspective of cognitive science. He obtained a degree in mathematics and psychology from the Hebrew University of Jerusalem in 1954 and a doctorate in psychology from the University of California at Be...