Thinking, Fast and Slow - Daniel Kahneman

Intuition or deliberation? Where you can (and can't) trust your brain

Thinking, Fast and Slow by Daniel Kahneman

Buy book - Thinking, Fast and Slow by Daniel Kahneman

What is the subject of the book Thinking, Fast and Slow?

Thinking, Fast and Slow (2011) recapitulates the decades of research that led to Daniel Kahneman's Nobel Prize and explains his contributions to our current understanding of psychology and behavioral economics. Kahneman was awarded the Nobel Memorial Prize in Economic Sciences in 2002. Over the years, Kahneman and his collaborators, whose work is discussed extensively in the book, have made important contributions to our knowledge of the human mind: how choices are formed, why certain judgment errors are so common, and how we can improve our own decision-making.

Who should read Thinking, Fast and Slow?

  • Anyone who is curious about how our brains work, how we solve problems and make decisions, and which vulnerabilities our minds are prone to.
  • Anyone interested in Nobel laureate Daniel Kahneman's contributions to psychology and behavioral economics, and in how those contributions matter to society as a whole.

Who is Daniel Kahneman, and what does he do?

Daniel Kahneman, PhD, was awarded the Nobel Prize in Economic Sciences in 2002. He is a Senior Scholar at the Woodrow Wilson School of Public and International Affairs, Professor of Psychology and Public Affairs Emeritus at the Woodrow Wilson School, Eugene Higgins Professor of Psychology Emeritus at Princeton University, and a fellow of the Center for Rationality at the Hebrew University of Jerusalem.

A tale of two minds: how our actions are shaped by two distinct systems - one automatic, the other deliberate.

In our minds, a fascinating drama is unfolding: a film-like storyline with two major characters, full of twists, turns, drama, and suspense. The two characters are System 1, which is impulsive, automatic, and intuitive, and System 2, which is thoughtful, methodical, and calculating. As they play off one another, their interactions shape the way we think, make judgments and choices, and act. System 1 is the part of the brain that works instinctively and abruptly, often without our conscious knowledge or permission. You can see this system at work when you are exposed to a very loud and unexpected sound. What do you do? You most likely shift your attention to the sound quickly and instinctively. That reflex is System 1 in action.

This mechanism is a remnant of our evolutionary past: being able to act and decide that quickly carried clear survival benefits. System 2 is what we usually have in mind when we think about the part of the brain responsible for our individual decisions, reasoning, and beliefs. It handles the conscious activities of the mind, including self-control, deliberate decision-making, and intentional focusing of attention.

Consider the following scenario: you're searching for a woman in a crowd. Your mind consciously concentrates on the task: it recalls features of the person in question and anything else that might help you locate her. This concentration helps eliminate potential distractions, and you are hardly aware of the other people in the crowd. If you maintain this focused attention, you may spot her within minutes; if you get distracted and lose concentration, you will have trouble finding her. As the following sections show, how we act is determined by the interplay between these two systems.

How mental laziness leads to mistakes and limits our intelligence.

Try to solve the following classic bat-and-ball problem to see how the two systems compare: A bat and a ball together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost? Your first thought, $0.10, most likely came from your intuitive and instinctive System 1, and it is completely incorrect! Take a second and run the numbers now. Do you see the mistake? The correct answer is $0.05. What happened is that your impulsive System 1 took over and responded immediately by relying on intuition rather than logic. But it responded too quickly. Normally, when confronted with a situation it cannot understand, System 1 calls on System 2 to work through the problem; in the bat-and-ball problem, however, System 1 is deceived. It perceives the problem as simpler than it really is and wrongly believes it can handle it on its own.
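
To make the arithmetic concrete, here is a minimal sketch (plain Python, not from the book) that first tests the intuitive answer and then solves the problem the way System 2 would:

```python
# Bat-and-ball check: together they cost $1.10, and the bat costs $1.00 more than the ball.

def total_price(ball):
    bat = ball + 1.00               # the bat is one dollar more expensive than the ball
    return round(ball + bat, 2)     # combined price of both items

print(total_price(0.10))            # intuitive System-1 answer gives 1.2 -> too much!

# Deliberate System-2 check: ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10
ball = round((1.10 - 1.00) / 2, 2)
print(ball, total_price(ball))      # 0.05 1.1 -> the correct answer
```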

The difficulty that the bat-and-ball problem reveals is that we are naturally inclined to be mentally lazy. When we use our brains, we tend to expend the smallest amount of energy possible on each task; this is known as the law of least effort. Because verifying the answer with System 2 would require more energy, the mind won't do so when it thinks it can get away with using System 1 alone. This is a shame, because using System 2 is an essential part of our intelligence. Research suggests that practicing System-2 tasks, such as focusing attention and exercising self-control, is associated with higher intelligence scores. The bat-and-ball problem illustrates this: our minds could have verified the answer with System 2 and avoided a common mistake. By being lazy and avoiding System 2, we limit the power of our own intellect.

Autopilot: why we are not always in conscious control of our thoughts and actions.

When you see the word fragment "SO_P," what do you think of first? Most likely nothing. But what if you first see the word "EAT"? Now, looking at "SO_P" again, you would most likely complete it as "SOUP." This process is called priming: we are primed when exposure to a word, concept, or event leads us to recall related words and concepts. If you had seen the word "SHOWER" instead of "EAT," you would most likely have completed the fragment as "SOAP." Such priming affects not just the way we think but also the way we act: the mind is influenced by hearing certain words and concepts, and so is the body. A good illustration is a study in which participants primed with words associated with old age, such as "Florida" and "wrinkle," responded by walking more slowly than usual.

Remarkably, the priming of actions and thoughts happens completely unconsciously; we are primed without even being aware of it. In short, priming shows that, contrary to popular belief, we are not always in conscious control of our actions, judgments, and choices. Instead, we are constantly being primed by particular social and cultural conditions. For example, research by Kathleen Vohs shows that the concept of money primes individuals to behave more selfishly. People primed with the idea of money, such as those exposed to pictures of money, act more independently and are less willing to become involved with, depend on, or accept demands from others. One implication of Vohs's research is that living in a society filled with triggers that prime money may nudge us away from our natural tendency toward altruism.

Priming, like other societal influences, can shape an individual's thoughts and, in turn, their choices, judgments, and behavior - and these reflect back into the culture, strongly shaping the kind of society we all live in.

Snap judgments: how the mind makes quick decisions even when it lacks enough information for a logical conclusion.

Consider the following scenario: at a party you meet someone called Ben and find him easy to talk to. Later, someone asks whether you know anybody who might want to donate to their organization. You find yourself thinking of Ben, even though the only thing you know about him is that he is friendly and easy to talk to. In other words, you liked one aspect of Ben's character and so assumed you would like everything else about him. Even when we know very little about a person, we tend to form an opinion of them based on limited impressions. This tendency of the mind to oversimplify things when it has insufficient information leads to many judgment errors. It is called exaggerated emotional coherence, better known as the halo effect: your favorable feelings about Ben's approachability place a halo around him, even though you know very little else about him.

However, this is not the only shortcut our minds take when making judgments. There is also confirmation bias: the tendency to agree with information that supports our previously held beliefs, and to accept whatever suggestion is put in front of us. To demonstrate this, consider the question, "Is James friendly?" Research shows that when we are given only this question and no other information, we are very likely to conclude that James is friendly, because the mind automatically confirms the suggested idea.

The halo effect and confirmation bias both occur because our minds are eager to make quick judgments. But this often leads to mistakes, because we do not always have enough data to judge accurately. Our minds rely on false suggestions and oversimplifications to fill the gaps in the data, leading us to potentially wrong conclusions. Like priming, these cognitive processes happen without our conscious awareness and influence our choices, judgments, and actions.

Heuristics are mental shortcuts that the mind employs to make fast judgments.

We often find ourselves in situations where we need to make a quick judgment. To help us do so, our minds have developed little shortcuts that let us immediately understand our surroundings. These are called heuristics. Most of the time these processes are helpful, but the trouble is that our minds tend to overuse them: applying them in situations for which they are not suited can lead to mistakes. To get a better grasp of what heuristics are and what errors they can produce, we can examine two of the many kinds: the substitution heuristic and the availability heuristic. With the substitution heuristic, we answer an easier question than the one that was actually asked.

Take this question, for example: "This woman is running for sheriff. How successful will she be in office?" We automatically substitute an easier question for the one we're supposed to answer, such as, "Does this woman look like someone who will make a good sheriff?" The benefit of this heuristic is that instead of researching the candidate's background and policies, we merely ask ourselves whether she fits our mental image of a good sheriff. Unfortunately, if she does not match our preconceived image of a sheriff, we may reject her - even if she has years of law-enforcement experience that make her an excellent candidate. Then there is the availability heuristic: overestimating the probability of something you hear about often or find easy to remember.

Strokes cause many more deaths than accidents do, yet in one study 80 percent of respondents judged accidental death to be the more likely fate. Because the media reports accidental deaths more often and they leave a stronger impression on us, we recall dramatic accidental deaths more easily than deaths from strokes, and as a result we can respond inappropriately to this kind of risk.

Why we struggle with statistics and make preventable mistakes as a result.

How do you forecast whether or not certain events will take place? One effective strategy is to always keep the base rate in mind: a baseline statistic on which other statistics depend. Consider the following scenario: a large taxi company's fleet is 20 percent yellow cabs and 80 percent red cabs. In other words, the base rate for yellow cabs is 20 percent and the base rate for red cabs is 80 percent. If you order a taxi and want to guess what color it will be, keep these base rates in mind and you will make a reasonably accurate prediction. We should therefore always remember the base rate when making forecasts, but unfortunately this does not always happen. In practice, base-rate neglect is extremely common.

One reason we tend to ignore the base rate is that we focus on what we expect rather than on what is most likely. Consider those cabs again: if you saw five red cabs drive past in a row, you would probably start to feel that the next one is bound to be yellow, just for the sake of variety. But no matter how many cabs of either color pass by, the probability that the next cab will be red remains around 80 percent - and if we remember the base rate, we should realize this. Instead, we tend to focus on what we expect to see, a yellow cab, and so we are very likely to be wrong.
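
A minimal sketch (illustrative, not from the book) that simulates independent cab arrivals and confirms that the base rate does not change after a streak of red cabs:

```python
import random

random.seed(42)
BASE_RATE_RED = 0.80          # 80% of the fleet is red, 20% yellow

def next_cab():
    return "red" if random.random() < BASE_RATE_RED else "yellow"

# Look only at the cab that arrives immediately after a streak of five red cabs.
streak, after_streak = 0, []
for _ in range(1_000_000):
    cab = next_cab()
    if streak >= 5:
        after_streak.append(cab)
    streak = streak + 1 if cab == "red" else 0

share_red = after_streak.count("red") / len(after_streak)
print(f"P(red after five reds) = {share_red:.2f}")   # about 0.80 -> still the base rate
```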

Base-rate neglect is a common mistake, part of the wider problem we have with statistics in general. We also struggle to remember that things regress to the mean: every process has an average state, and deviations from that average tend to drift back toward it. Suppose a football striker who normally scores five goals a month scores ten goals in September. If she then goes back to scoring around five goals a month for the rest of the season, her coach will probably criticize her for failing to keep up her "hot streak." The striker, however, does not deserve the criticism: she is simply regressing to the mean.
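
A small illustrative simulation (assumed numbers, not from the book) of regression to the mean: after an unusually good month, the following month tends to fall back toward the striker's true average.

```python
import random

random.seed(0)

# Assume the striker's true ability is about 5 goals per month, with random variation.
def goals_in_month():
    return max(0, round(random.gauss(5, 2)))

months = [goals_in_month() for _ in range(100_000)]

# After an unusually good month (9 or more goals), what happens the following month?
after_hot = [months[i + 1] for i in range(len(months) - 1) if months[i] >= 9]
print(sum(after_hot) / len(after_hot))   # about 5 -> the "hot streak" regresses to the mean
```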

Past imperfect: why we remember events in hindsight rather than as we actually experienced them.

Our minds do not remember experiences in a straightforward way. We have two distinct mechanisms, collectively referred to as our memory selves, each of which remembers events differently. First is the experiencing self, which records how we feel in the present moment. It asks, "How do you feel right now?" Then there is the remembering self, which records how the whole experience played out after it is over. It asks, "How was the experience overall?" The experiencing self gives the more accurate account of what happened, because our feelings during an experience are the most direct evidence of it. The remembering self, however, which is less accurate because it registers memories after the event has ended, dominates how we recall things.

There are two reasons why the remembering self dominates the experiencing self. The first is duration neglect: we disregard the total length of an event in favor of a particular memory from it. The second is the peak-end rule: we overemphasize what happens at the end of an event. Consider an experiment that measured people's memories of a painful colonoscopy and demonstrated this dominance of the remembering self. Patients were divided into two groups: one group underwent long, drawn-out colonoscopies, while the other received much shorter procedures in which the level of pain rose sharply toward the end.

You would expect the unhappiest patients to be those who endured the longer procedure, since their pain lasted longer. That is certainly how they felt at the time: when asked about their pain during the procedure, each patient's experiencing self gave an accurate answer, and those who underwent the longer procedures felt worse. Afterward, however, when the remembering self took over, those who went through the shorter procedure with the more painful ending felt the worst. This study offers a clear demonstration of duration neglect, the peak-end rule, and the fallibility of our memories.
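
An illustrative model (invented numbers, not Kahneman's actual data) of how a peak-end "remembering self" can rate a shorter but sharply ending procedure as worse than a longer, milder one:

```python
# Minute-by-minute pain ratings on a 0-10 scale (purely illustrative).
long_mild_end     = [4, 6, 7, 7, 6, 5, 4, 3, 2, 1]   # long procedure that tapers off
short_painful_end = [4, 6, 7, 8]                      # short procedure that ends at its peak

def experienced_pain(ratings):
    return sum(ratings)                        # total suffering: duration matters

def remembered_pain(ratings):
    return (max(ratings) + ratings[-1]) / 2    # peak-end rule: duration is neglected

for name, r in [("long, mild end", long_mild_end), ("short, painful end", short_painful_end)]:
    print(name, experienced_pain(r), remembered_pain(r))
# The long procedure involves more total pain, yet the short one is remembered as worse.
```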

How shifting the focus of our minds can have a significant impact on our thoughts and actions.

Our minds use different amounts of energy depending on the task at hand. When no attention needs to be mobilized and little energy is required, we are in a state of cognitive ease. When our minds must mobilize attention, they use more energy, and we enter a state of cognitive strain. These changes in the brain's energy level have a significant effect on our behavior. In a state of cognitive ease, the intuitive System 1 is in charge and the logical, more energy-demanding System 2 is weakened. This means we are more intuitive, creative, and happy, but also more likely to make mistakes.

In a state of cognitive strain, our awareness is heightened and System 2 takes over as the primary decision-maker. System 2 is more likely than System 1 to double-check our judgments, so although we are much less creative, we make fewer errors. You can deliberately influence how much energy the mind uses in order to get into the right frame of mind for a given task. If you want a message to be persuasive, for example, try to promote cognitive ease. One way to do this is to repeat the information: when information is repeated, or made easier to remember, it becomes more persuasive. This is because our minds have evolved to react positively to repeated exposure to the same clear information. Seeing something familiar puts us in a state of cognitive ease.

Cognitive strain, on the other hand, helps with things like statistical problems. We enter this state when we are exposed to information presented in a confusing way, for example in a hard-to-read font. Our minds perk up and increase their energy levels in an effort to understand the problem, and as a result we are less likely to simply give up.

How the way probabilities are presented to us affects our assessment of risk.

The way ideas and problems are presented to us has a significant effect on how we judge and handle them. Even small changes to the details or emphasis of a statement or question can dramatically change how we respond to it. A good example of this is the way we assess risk. You might think that once we have determined the probability of a risk occurring, everyone will treat it the same way. This, however, is not the case. Even for carefully calculated probabilities, simply changing how the number is expressed can alter how we approach it. For example, people consider a rare event more likely to occur when it is expressed in terms of relative frequency rather than statistical probability.

In what is known as the Mr. Jones experiment, two groups of psychiatric professionals were asked whether it was safe to discharge Mr. Jones, a psychiatric patient, from the hospital. The first group was told that patients like Mr. Jones had a "10 percent probability of committing an act of violence," while the second group was told that "of every 100 patients similar to Mr. Jones, 10 are estimated to commit an act of violence." The results were published in the journal Psychological Science: nearly twice as many respondents in the second group refused to discharge him. Another way our attention is diverted from what is statistically relevant is called denominator neglect. This happens when we ignore plain statistics in favor of vivid mental images that sway our decisions.

To illustrate, compare these two statements: "This drug protects children from disease X, but there is a 0.001 percent risk of permanent disfigurement" and "One in every 100,000 children who take this drug will be permanently disfigured." Both statements describe the same risk, but the second conjures up the image of a disfigured child and is far more striking, which is why it makes us less inclined to administer the drug.
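
A tiny illustrative check (not from the book) showing that the two framings are numerically identical even though they feel very different:

```python
# "a 0.001 percent risk" versus "one in every 100,000 children" - the same number.
probability = 0.001 / 100        # percent converted to a plain probability
frequency   = 1 / 100_000        # relative-frequency framing
print(probability, frequency)    # both 1e-05: identical risk, very different gut reaction
```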

Why we are not robots: we do not make decisions on the basis of logic and reason alone.

What determines the choices we make? For a long time, a prominent and influential group of economists argued that we make our choices purely on the basis of rational argument: we all decide according to utility theory, which says that when individuals make decisions, they look only at the rational facts and choose the option with the best overall outcome for them, i.e., the most utility. For example, if you like oranges more than kiwis, utility theory says you would also prefer a 10 percent chance of winning an orange to a 10 percent chance of winning a kiwi. It seems obvious, doesn't it?

The most influential group of economists in this camp was the Chicago School of Economics, whose best-known scholar was Milton Friedman. The Chicago School held that the individuals in the marketplace are ultra-rational decision-makers, whom economist Richard Thaler and legal scholar Cass Sunstein later dubbed Econs. As Econs, all individuals act the same way, valuing goods and services according to their rational needs. Econs also value their wealth rationally, weighing only how much utility it provides them. Consider two people, John and Jenny, who each have a fortune of $5 million. According to utility theory, because they have the same amount of money, they should both be equally happy with their finances.

But what if we complicate things a little? Suppose their $5 million fortunes are the result of a day at the casino, and their starting points were very different: Jenny walked in with $9 million and saw her wealth drop to $5 million, while John walked in with only $1 million and saw his wealth increase fivefold. Are John and Jenny still equally happy with their $5 million? Unlikely. Clearly, there is more to the way we value things than sheer utility. Because we do not all perceive value the way utility theory assumes, we can make strange and seemingly irrational choices, as the next section shows.

Why we are often swayed by emotional factors rather than making decisions on purely rational grounds.

If utility theory doesn't work, what does? One alternative is prospect theory, developed by the author. Kahneman's prospect theory challenges utility theory by showing that we don't always act in the most rational way when we make choices. Consider these two scenarios. In the first, you are given $1,000 and must choose between receiving a guaranteed additional $500 or taking a 50 percent chance of winning another $1,000. In the second, you are given $2,000 and must choose between a guaranteed loss of $500 or a 50 percent risk of losing $1,000. If we made purely logical choices, we would choose the same way in both cases. But we don't: in the first scenario most people take the sure gain, while in the second most people take the gamble.
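
A minimal sketch (plain Python, not from the book) showing why a purely rational chooser would be indifferent: the expected final amounts are identical in both scenarios.

```python
# Scenario 1: start with $1,000 -> sure gain of $500, or a 50% chance of gaining $1,000.
# Scenario 2: start with $2,000 -> sure loss of $500, or a 50% chance of losing $1,000.

def outcomes(start, sure_change, gamble_change, p=0.5):
    sure   = start + sure_change
    gamble = p * (start + gamble_change) + (1 - p) * start   # expected value of the gamble
    return sure, gamble

print(outcomes(1_000, +500, +1_000))   # (1500, 1500.0)
print(outcomes(2_000, -500, -1_000))   # (1500, 1500.0)
# Both options in both scenarios lead to $1,500 on average, yet people choose differently.
```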

Prospect theory helps explain why. It highlights at least two reasons why we do not always act rationally, both of them rooted in loss aversion: the fact that we fear losses more than we value gains. The first reason is that we value things relative to a reference point. In the two scenarios, starting with $1,000 or $2,000 changes our willingness to gamble, because the starting point affects how we value our position. The reference point in the first scenario is $1,000 and in the second it is $2,000, so ending up with $1,500 feels like a win in the first case but a distasteful loss in the second. Although the reasoning here is clearly inconsistent, we judge value as much by our starting point as by the actual objective value at the moment.

Second, we are influenced by the principle of diminishing sensitivity: the value we perceive can differ from the actual value. Dropping from $1,000 to $900 does not feel as bad as dropping from $200 to $100, even though the monetary value of both losses is the same. Likewise, in our example, the perceived value lost when moving from $1,500 to $1,000 is greater than the perceived value lost when moving from $2,000 to $1,500.
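
A rough sketch of a prospect-theory-style value function; the parameter values below are commonly cited estimates (roughly 0.88 for diminishing sensitivity and 2.25 for loss aversion) and are used here only for illustration.

```python
ALPHA  = 0.88   # diminishing sensitivity: each extra dollar matters a bit less
LAMBDA = 2.25   # loss aversion: losses loom roughly twice as large as equivalent gains

def perceived_value(outcome, reference):
    change = outcome - reference
    if change >= 0:
        return change ** ALPHA                    # gains feel good, but concavely so
    return -LAMBDA * ((-change) ** ALPHA)         # losses hurt disproportionately

# Ending up with $1,500 feels like a win from a $1,000 reference point...
print(perceived_value(1_500, reference=1_000))    # positive
# ...but like a painful loss from a $2,000 reference point.
print(perceived_value(1_500, reference=2_000))    # negative, and larger in magnitude
```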

Why the mind constructs complete pictures to explain the world, and why these pictures lead to overconfidence and mistakes.

 In order to comprehend circumstances, our brains naturally use cognitive coherence; we create full mental images to explain ideas and concepts to ourselves and others. For example, when it comes to the weather, we have a plethora of mental pictures. Let us consider summer weather, for example. We may have an image in our minds of the brilliant, hot sun showering us in warmth and light. In addition to assisting us in comprehending information, we depend on these pictures while making decisions about our lives. When we are making choices, we return to these illustrations and base our assumptions and conclusions on what we have learned from them. Example: If we are looking for summer clothing, we base our choices on our mental picture of the weather that will be present throughout that season.

The problem is that we put far too much trust in these pictures. Even when the available facts and evidence contradict our mental images, we still let the images guide us. You might head out in shorts and a T-shirt in summer even though the forecaster has predicted fairly cool weather, simply because your mental picture of summer tells you to - and you may end up shivering outside. In short, we are enormously overconfident in our often faulty mental images. But there are ways to overcome this overconfidence and start making better forecasts.

One way to avoid mistakes is to use reference class forecasting: instead of basing your judgments on your rather general mental images, use specific historical cases to make more accurate forecasts. For instance, think of the last time you went out on a chilly summer day. What were you wearing then? Another option is to draw up a long-term risk policy with specific contingency plans for both success and failure of your forecasts. Through preparation and precaution, you can rely on evidence rather than general mental pictures and make more accurate predictions. In the case of the weather, this might mean packing an extra sweater just to be on the safe side.

Thinking, Fast and Slow: final summary.

The central message of Thinking, Fast and Slow is that our minds comprise two systems. The first is intuitive and takes little effort; the second is deliberate and demands much of our attention. Our thoughts and actions vary depending on which of the two systems is in control of the mind at a given moment.

Actionable advice:

  • Repeat your message. Messages become more persuasive when we are exposed to them repeatedly, most likely because we evolved to treat repeated exposure to things with no bad consequences as a sign that they are fundamentally good.
  • Don't be swayed by rare statistical events that are over-reported in the media. Disasters and other catastrophes do happen, but we tend to overestimate their statistical likelihood because of the vivid images the media attaches to them.
  • Use a good mood to be more creative and intuitive. When you are in a good mood, the vigilant, analytical part of the mind relaxes, and the more intuitive, faster-thinking system takes control, which also makes you more creative.

Buy book - Thinking, Fast and Slow by Daniel Kahneman

Written by BrookPad Team based on Thinking, Fast and Slow by Daniel Kahneman
