Daniel Kahneman is a psychologist who studies decision making and won the Nobel Prize in Economics. This book is a layperson’s summary of common decision-making flaws we humans make over and over again. In the conclusion, Kahneman says his intent is to provide a new vocabulary, so that we can recognize these flaws (which usually arise in our fast-thinking, instinctive “System 1”) and allow our more rational “System 2” (the slow-thinking one) to appropriately assess things.
Lots of interesting stuff, with plenty of examples. Here are some notes I took while reading; at the end I’ll put in my final two cents.
- “Many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.” There’s an academic put-down for you…
- Anchor effect and anchoring index. We are influenced by an initial number or idea we see. This is partially why sale prices work: “What would you expect to pay for this set of knives? $100? Well, we will let you have them today for only $59.95!” Without knowing anything about the knives, this already seems like a deal. (Unless you are as jaded towards blatant marketing as I am.)
- Availability bias – we tend to estimate the likelihood of some event by how readily examples come to mind. Related to “availability cascade”: we hear about terrorist attacks often, but they are actually quite rare compared to other dangers, like car accidents.
- Substitution – the answer to an easy question (usually a query of our gut feeling about something) serves as the answer to a harder question. “How much should I donate to save the whales?” –> “How much do I like whales?” And “System 1” (the fast thinker) is good at translating between such scales.
- “Probability neglect” – People are bad at dealing with small risks – either completely ignore or grossly overestimate.
- Illusion of validity – it is nearly impossible to predict future outcomes in almost all areas, yet we remain confident in our own abilities and those of other seemingly competent professionals. Algorithms and statistics on past similar situations often outperform intuition.
- “Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck.”
- “The outcome of a startup depends as much on the achievements of its competitors and on changes in the market as on its own efforts.”
- Experts tend to ignore uncertainty: “Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients.” In one study, doctors’ diagnoses matched autopsy findings only about 60% of the time.
- Prospect theory – people are generally loss averse. We are risk averse regarding likely gains, preferring a sure thing to a favorable gamble (lotteries, with their tiny chance of a huge win, are the exception), but become risk seeking when only bad options are available. Rather than accept a certain loss, we often opt for a gamble with a high probability of an even larger loss – too enticed by “it’s a long shot, but if it works all our troubles will be over!”
- Additional findings of prospect theory: we pay a premium for certainty of gains or certain avoidance of losses.
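These prospect-theory claims can be made concrete with the standard value function. Here is a minimal sketch using the parameter estimates (curvature ≈ 0.88, loss-aversion coefficient ≈ 2.25) that Tversky and Kahneman published in 1992 – they are not figures quoted in this book:

```python
# Prospect theory's value function, sketched with the parameter estimates
# alpha ~= 0.88 and lam ~= 2.25 from Tversky & Kahneman (1992).
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss relative to the reference point."""
    if x >= 0:
        return x ** alpha        # diminishing sensitivity to gains
    return -lam * (-x) ** alpha  # losses loom larger (loss aversion)

# A sure loss of $900 vs. a 90% chance to lose $1,000 (equal expected value):
sure_loss = value(-900)
gamble = 0.9 * value(-1000) + 0.1 * value(0)

print(f"v(sure $900 loss) = {sure_loss:.1f}")
print(f"v(gamble)         = {gamble:.1f}")
print("gamble feels less bad:", gamble > sure_loss)
```

With these numbers, a sure loss of $900 has a lower (more painful) subjective value than a 90% chance of losing $1,000, even though both have the same expected loss: exactly the risk seeking over bad options described above.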
- Kind of a different topic in the last section of the book, but interesting finding – we value our memory of something (or anticipated memory) more than the actual event. Eg. excessive picture taking by tourists. “Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”
Ok, my thoughts now. The cure for emotion-based loss aversion is to think in, and act on, probabilities. Avoid opportunities to second-guess probabilistically rational decisions. An example of sub-rational behavior in the face of probabilities:
Would you rather choose:
a) a 95% chance to win $100, or
b) a sure win of $90?
The sure thing, b), seems like a good choice, even when you “know” that the expected value of a) is $95. Our decisions are (sometimes) a sequence of such choices; when we always pick the sure thing, we are shortchanging ourselves over the long run. “You win some, you lose some” is a winning attitude — when coupled with sound probabilities, of course.
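The long-run arithmetic is easy to check with a quick simulation. A minimal sketch (the $90 figure for the sure option b) is my assumption, picked to sit below the gamble’s $95 expected value):

```python
import random

# Expected value of option a): a 95% chance to win $100.
ev_gamble = 0.95 * 100  # $95 per decision, on average

# Simulate repeatedly facing the same choice and always taking the gamble,
# vs. always taking an assumed sure option b) of $90.
random.seed(42)
n = 100_000
sure_amount = 90        # assumption: a sure win just below the gamble's EV

gamble_total = sum(100 if random.random() < 0.95 else 0 for _ in range(n))
sure_total = sure_amount * n

print(f"gamble average per decision: ${gamble_total / n:.2f}")
print(f"sure thing per decision:     ${sure_amount:.2f}")
```

Over many such decisions, always taking the gamble averages out near $95 per decision and beats the sure $90; any single decision, of course, can still come up empty.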
And therein lies the real problem, I think. We lean towards the sure thing because, outside of trivial or “toy” decisions, we rarely know the true probabilities in the real world.