    Suggested summer reading: The Undoing Project

    A book that examines how humans think explains both our strengths and weaknesses as practitioners.

The representativeness heuristic occurs when probabilities are estimated based on how closely A resembles, or is representative of, B. While this cognitive shortcut is often accurate, it can lead to serious errors because representativeness ignores other factors that are more germane to accurate probability estimates. Kahneman and Tversky cite as an example leaping to the conclusion that the most likely occupation of someone described as meek and tidy, with a need for order and a passion for detail, is librarian. But because there are far more farmers, lawyers, salesmen, etc. than librarians, and thus far more individuals with these characteristics in those other professions, such a conclusion is fundamentally flawed (see the sketch below). Tversky and Kahneman also detailed factors leading to representativeness errors, including insensitivity to prior probabilities, overweighting of small sample sizes, overestimation of the actual predictability of a given event, and unwarranted confidence in the accuracy of our predictions (ie, the illusion of validity).3 Another cause of representativeness error is failure to appreciate the phenomenon of regression toward the mean, which leads to an overestimate of the value of negative incentives and an underestimate of the value of positive incentives.
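A minimal sketch of the base-rate logic behind the librarian example. The population counts and trait rates here are hypothetical, chosen only to illustrate the arithmetic; none of these numbers come from the paper:

```python
# Hypothetical numbers showing how base rates ("prior probabilities")
# overturn the representativeness judgment in the librarian example.

farmers, librarians = 2_000_000, 20_000   # assumed population counts
p_meek_given_farmer = 0.05       # assumed: few farmers fit the description
p_meek_given_librarian = 0.50    # assumed: most librarians fit it

# Expected number of "meek and tidy" individuals in each group
meek_farmers = farmers * p_meek_given_farmer            # 100,000
meek_librarians = librarians * p_meek_given_librarian   # 10,000

# Bayes' rule over the two groups: even though a librarian is far more
# likely to fit the description, the posterior still favors farmer.
p_librarian = meek_librarians / (meek_librarians + meek_farmers)
print(f"P(librarian | meek and tidy) = {p_librarian:.2f}")  # 0.09
```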

A second source of error is the availability heuristic, in which we judge the probability of an event simply by our ability to recall similar events. This introduces all sorts of biases, including overestimating the strength of an association based on the ease with which we can retrieve an example. That ease, in turn, can reflect how we search for associations rather than how common they actually are, as the simulation below illustrates.
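A toy simulation of this effect, under the assumption that memorable ("vivid") events are retrieved more readily than mundane ones even when both occur equally often; the retrieval rates are invented for illustration:

```python
import random

random.seed(0)

# Two kinds of events occur equally often, but vivid events are assumed
# to be easier to retrieve from memory than mundane ones.
N = 10_000
events = ["vivid"] * (N // 2) + ["mundane"] * (N // 2)
recall_prob = {"vivid": 0.9, "mundane": 0.3}  # assumed retrieval rates

# A frequency judgment formed only from what comes to mind
recalled = [e for e in events if random.random() < recall_prob[e]]
estimate = recalled.count("vivid") / len(recalled)

print("True share of vivid events: 0.50")
print(f"Estimated share from recall: {estimate:.2f}")  # ~0.75
```

Ease of retrieval, not true frequency, drives the judgment.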

A third source of error results from use of the anchoring heuristic: our tendency to estimate final results by adjusting from initial data, which may be highly skewed. For example, assume you are given 5 seconds to calculate either 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 or 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8. Which ordering is more likely to yield a quick estimate closest to the correct product of 40,320? Use of the former, descending sequence consistently leads to more accurate estimates, because the first few steps of the calculation supply a larger anchor (see the sketch below). Because of anchoring biases, people also tend to overestimate the probability of conjunctive events (in which every component event must occur) and underestimate the probability of disjunctive events (in which at least one component event must occur).
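A quick way to see why the two orderings anchor differently is to compare their running partial products; this sketch simply prints them:

```python
from itertools import accumulate
from operator import mul

descending = [8, 7, 6, 5, 4, 3, 2, 1]
ascending = descending[::-1]

# Partial products after each step: the first few seconds of mental
# arithmetic supply the anchor from which people extrapolate.
print(list(accumulate(descending, mul)))  # [8, 56, 336, 1680, ...]
print(list(accumulate(ascending, mul)))   # [1, 2, 6, 24, ...]
```

After three steps the descending order has already reached 336 versus 6 for the ascending order, so extrapolating from it lands much closer to the true product of 40,320.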

Lewis provides many examples to help readers understand these biases, which underpin new disciplines ranging from evidence-based medicine to behavioral economics. Chapter 8 addresses the former by telling the story of Dr. Don Redelmeier, who studied the causes of preventable errors in a Canadian trauma hospital. After reading Kahneman and Tversky’s article in Science,3 he grasped that these heuristic errors could also be the source of medical errors. As fate would have it, he then served an internal medicine residency at Stanford after Tversky had been recruited to its faculty. A lunch the two men shared sparked a collaboration that helped lay the foundation of evidence-based medicine. For example, they discovered that physicians made different, and often lower-value, management decisions when considering patients as individuals versus as groups.4 They also pointed out that framing is important in accurate medical decision making: the more detailed the description of a condition and the more diagnostic options provided to a physician, the more likely that he or she would reach an accurate diagnosis and management plan (aka the Unpacking Principle).5 Redelmeier also met Kahneman, after the latter had begun to explore emotional influences on decision making. Together they published a paper concluding that patients tend to weigh potential losses more heavily than the corresponding gains when choosing management plans, and that patients’ interpretation of events and decision making is strongly influenced by how situations and data are presented, or framed, to them.6


This understanding of the impact of emotions on decision making also formed the foundation of behavioral economics. Loss aversion is a key concept here as well: in general, people find the loss of X dollars more aversive than they find the gain of X dollars attractive. However, people usually don’t weigh monetary values in simple, dispassionate, numerical terms but rather assign subjective values, or “utilities,” to financial outcomes; thus, risk aversion at a given monetary value tends to decrease with increasing wealth (see the sketch below). Lewis details the impact of Tversky and Kahneman’s work on the field of behavioral economics through the eyes of Richard Thaler, a University of Chicago behavioral economist who was heavily influenced by their writing and became another of Kahneman’s collaborators.
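These ideas are often summarized in the prospect-theory value function. The sketch below uses a curvature of about 0.88 and a loss-aversion coefficient of about 2.25, parameter estimates commonly attributed to Tversky and Kahneman’s later experimental work; treat them as illustrative assumptions, not figures from this book:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, steeper and
    convex for losses, so losses loom larger than equal-sized gains."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

# Loss aversion: losing $100 hurts more than gaining $100 pleases.
print(f"value(+100) = {value(100):7.1f}")   # ~  57.5
print(f"value(-100) = {value(-100):7.1f}")  # ~-129.5

# Diminishing sensitivity: the second $100 of gain adds less subjective
# value than the first, consistent with risk aversion over gains.
print(f"marginal value of second $100 = {value(200) - value(100):.1f}")  # ~48.4
```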

Throughout the book, Lewis returns to Kahneman and Tversky’s personal lives, chronicling their trials and tribulations, academic achievements and frustrations, and, over time, the chilling of their relationship caused by time, distance, and professional jealousy. The story concludes with one of them winning the Nobel Prize, but I will not spoil the ending.

    Take-home message

Why is this book so important to read? I believe it gives powerful insights into how we think and why we make errors. Recognizing the sources of human error may help you avoid those errors. Understanding how humans think may help you think better. I hope it will also motivate you to read Kahneman’s New York Times bestseller, Thinking, Fast and Slow,2 and many of Kahneman and Tversky’s elegantly crafted papers. But if for no other reason, read The Undoing Project because it’s just a great book.

    REFERENCES

1. Lewis M. The Undoing Project: A Friendship That Changed Our Minds. New York, NY: W.W. Norton & Company; 2017.

2. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2011.

    3. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974 Sep 27;185(4157):1124-31.

    4. Redelmeier DA, Tversky A. Discrepancy between medical decisions for individual patients and for groups. N Engl J Med. 1990 Apr 19;322(16):1162-4.

5. Redelmeier DA, Koehler DJ, Liberman V, Tversky A. Probability judgement in medicine: discounting unspecified possibilities. Med Decis Making. 1995 Jul-Sep;15(3):227-30.

6. Redelmeier DA, Rozin P, Kahneman D. Understanding patients’ decisions. Cognitive and emotional perspectives. JAMA. 1993 Jul 7;270(1):72-6.

    Charles J Lockwood, MD, MHCM
    Dr Lockwood, Editor-in-Chief, is Dean of the Morsani College of Medicine and Senior Vice President of USF Health, University of South ...
