“Brilliant. . . . Lewis has given us a spectacular account of two great men who faced up to uncertainty and the limits of human reason.” —William Easterly, Wall Street Journal

Forty years ago, Israeli psychologists Daniel Kahneman and Amos Tversky wrote a series of breathtakingly original papers that invented the field of behavioral economics. One of the greatest partnerships in the history of science, Kahneman and Tversky’s extraordinary friendship incited a revolution in Big Data studies, advanced evidence-based medicine, led to a new approach to government regulation, and made much of Michael Lewis’s own work possible. In The Undoing Project, Lewis shows how their Nobel Prize–winning theory of the mind altered our perception of reality.
I tend to get excited when the best storytellers write a new book, and even more so when the book covers a topic I have been focused on recently. Such was the case when Lewis covered the Nobel Prize–winning duo of Daniel (Danny) Kahneman and Amos Tversky, two psychologists who developed much of the foundational work behind behavioral finance. I also see its roots in Robert Cialdini’s books (Influence, Pre-Suasion). While Kahneman (Tversky died in 1996, so he did not share in the Nobel) wrote Thinking, Fast and Slow to share their life’s work, here Lewis tries to identify why they worked so well together. In fact, I don’t recall him talking about thinking fast (immediate response) or slow (long-term investor) at all. The book reads somewhere between biography and nonfiction about behavioral finance.

Some of the more interesting thoughts in the book have nothing to do with behavioral finance but a lot to do with psychology. Lewis discusses Daryl Morey, whom I would call a basketball sabermetrician. Morey has been general manager of the Houston Rockets since 2007, using tactics similar to those described for baseball in Moneyball (also by Lewis). In the same chapter Lewis offers a definition of a nerd: a person who knows his own mind well enough to mistrust it. This sounds like something Charlie Munger would say (high praise).

Both Kahneman and Tversky lived in Israel, where everyone serves a stint in the military, and both saw action in the Six-Day War in 1967 and the Yom Kippur War in 1973 (when they returned from America to take up arms). Kahneman helped the Israelis design better tools for selecting officers and training pilots. Tversky was a paratrooper. Both were professors at Hebrew University at the start of the first war.

Our minds trick us. After the fact, we know exactly why we saw the event coming that no one anticipated (see Taleb’s The Black Swan), and surveys weigh more heavily toward events that have recently occurred. The reasons often given trace back to our days as prey on the plains of Africa, where thinking fast keeps you alive: you run away from a predator as fast as you can. One way to catch these inconsistencies is to devise three options where a person chooses A over B, B over C, and C over A, violating the law of transitivity familiar to anyone who has studied algebra or logic. Lewis provides many of these examples, as did Kahneman in Thinking, Fast and Slow, and I fall for nearly every one, even when I have seen them before (sometimes, occasionally, I remember).

K/T developed several heuristics, in which laws of chance are replaced by rules of thumb. “We often decide that an outcome is extremely unlikely, or impossible, because we are unable to imagine any chain of events that could cause it to occur. The defect, often, is in our imagination.”

• Representativeness – we match the situation to a previously developed mental model rather than thinking through the facts as presented (and are generally correct). This creates systematic errors, such as looking at a kid and immediately deciding whether they are athletic. Looking at the negative can help avoid these problems. For example, the WW2 bombs landing in London appeared to target certain areas but were actually random. Likewise, if you have 23 randomly selected people in a room, the odds are better than half that at least two share a birthday (see the short computation after this list).
• Availability – we more easily recall memorable events.
• Conditionality – we make contingent assumptions when none are stated. We assume normal operating conditions (e.g., a normal distribution, as in VaR). “…people don’t know what they don’t know, but that they don’t bother to factor their ignorance into their judgments.”
• Anchoring (and adjustment) – if you are first shown a large (or small) number, your subsequent estimate tends to be large (or small).
• Simulation – what could happen dominates what is likely to happen; this can lead to analysis paralysis (I find it difficult to overcome when investing for my personal accounts; it is hard to pull the trigger).
• Recency bias – recent events influence our probability assumptions.
• Hindsight bias – once we know how something turns out, our recollection is that we predicted it in advance (similar to Taleb’s black swans).
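The 23-person birthday claim is easy to verify. Here is a minimal Python sketch (my own illustration, not from the book) that computes the exact probability under the usual simplifying assumption of 365 equally likely birthdays:

```python
from math import prod

def p_shared_birthday(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays and no leap years."""
    # P(all n birthdays distinct) = (365/365) * (364/365) * ... * ((365-n+1)/365)
    p_distinct = prod((365 - k) / 365 for k in range(n))
    return 1 - p_distinct

print(f"{p_shared_birthday(23):.3f}")  # ~0.507, already better than even odds
```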
How do ideas form in our minds? Is the process conscious or indirect? When we study in school, or for a credential, the focus is on repeating the “right” answer. While it would be harder to grade, I have always thought it would be better to provide an answer and ask the student to improve it.

Who knew that a bad experience could be remembered more fondly if the final part of the event was less distasteful (the peak-end rule)? This was tested with colonoscopies in which the instruments were withdrawn from the body either slowly or quickly; withdrawing them slowly made it more likely that the patient would return for future tests.

The risk manager will discover, usually the hard way, that avoiding a risk receives no reward, but missing a risk gets the blame. This is a human bias.

Accounting does not consider the impact on the environment, on limited supply, or on emotions. Utility theory overstates the value. Risk aversion is a fee willingly paid to avoid regret. In any case, we all prefer to avoid pain more than we want to secure gain, we react more to relative changes than to absolute ones, and probability is not straightforward. (A short sketch at the end of this review illustrates the asymmetry.)

The benefits to a group often conflict with the benefits to an individual. Antibiotics are one example: in aggregate, limiting antibiotic use is better because bacteria have less chance to evolve resistance, but for an individual, antibiotics are either useful or neutral; there is no downside to the individual in being treated.

One of the fascinating revelations in the book (for me, at least) was the need to invert: “How do you understand memory? You don’t study memory. You study forgetting.” As we study other topics, we should look for opportunities to use this strategy.

While much of the interest in this branch of psychology is applied to investment strategies, K/T worried more about geopolitical biases and the avoidable mistakes political leaders could make by relying on gut feel. They thought intelligence reports written as essays should be replaced by probabilities. Telling a story is not helpful in this context, but politicians tend to be afraid of numbers. We have seen evidence of this recently, as briefings to the US president are said to focus on charts and short sound bites.

If you are reading about financial economics, this should not be your first book; it is more useful to someone already familiar with the concepts from other sources. For someone starting out on this topic, I personally like Why Smart People Make Big Money Mistakes and How to Correct Them by Gary Belsky and Thomas Gilovich, followed by Thinking, Fast and Slow by Kahneman, before reading the Lewis book.
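The asymmetry between avoiding pain and securing gain, and the focus on relative rather than absolute changes, is what Kahneman and Tversky formalized in prospect theory’s value function: outcomes are valued as gains or losses from a reference point, with losses looming larger. A minimal Python sketch, using the parameter estimates commonly cited from Tversky and Kahneman’s 1992 paper (the specific numbers are not from Lewis’s book):

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value of an outcome x measured relative to a
    reference point (not total wealth): gains are dampened, and losses
    are dampened the same way but scaled by the loss-aversion
    coefficient lam."""
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** alpha      # losses loom ~2.25x larger

print(prospect_value(100.0))   # ~57.5
print(prospect_value(-100.0))  # ~-129.5
```

A $100 loss is felt more than twice as strongly as a $100 gain, which is one way to read “risk aversion is a fee willingly paid to avoid regret.”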