
“Against the Gods” by Peter L. Bernstein

This is a book on the history of risk, focused mostly on the key developments in the mathematics of probability theory.  Kind of a weird topic, but interesting enough; probability theory is important across such a wide range of other fields.

I was a little distracted by lots of little errors throughout, e.g. p. 117 says Jacob Bernoulli lived from 1654-1705, but the next paragraph says he died at age 80 (those dates make him 50 or 51).  P. 78 says there were 7195 plague deaths during a week in Sept 1665, but the figure on the next page shows that many deaths for April 1665.

Before the age of rationality, risk was obviously known.  A die would sometimes turn up a 1, sometimes a 6.  But apparently not much formal thought was given to why, or to how risk could be exploited or avoided — it was in the hands of fate, or up to the “gods” of this book’s title.

Knowledge was built up little by little.  Often there were independent advances tailored to solving a specific problem, e.g. calculating gambling odds or mortality tables.  That makes me feel less bad about not grasping hard subjects immediately; not even the originating geniuses did that from scratch.  By the same token, how wonderful it is that books and knowledge transfer exist.  Really, that’s the one key characteristic of modern humanity: intelligence + society.  But I digress.

There’s a chapter on Galton and regression to the mean, plus its application to the “random walk” of the stock market.  The recommendation is to “ignore short-term volatility and hold on for the long pull… The stock market may be a risky place for a matter of months or even for a couple of years, but the risk of losing anything substantial over a period of five years or longer should be small.”  Ok, easy enough, and it squares with our experience in the USA over the past 100 years or so.
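Bernstein doesn’t put numbers on that claim, so here is a quick way to feel it out: a minimal Monte Carlo sketch assuming a lognormal random-walk market, with annual_mu and annual_sigma as rough placeholder figures of my own, not numbers from the book.

```python
import random

def prob_of_loss(years, annual_mu=0.07, annual_sigma=0.18,
                 trials=100_000, seed=0):
    """Estimate P(end < start) under a lognormal random walk.
    annual_mu and annual_sigma are rough placeholders."""
    rng = random.Random(seed)
    drift = annual_mu - 0.5 * annual_sigma ** 2      # lognormal drift correction
    losses = sum(
        1 for _ in range(trials)
        if sum(rng.gauss(drift, annual_sigma) for _ in range(years)) < 0
    )
    return losses / trials

for horizon in (1, 5, 10):
    print(f"{horizon:2d} years: P(loss) ~ {prob_of_loss(horizon):.1%}")
```

Under these placeholder assumptions the loss probability shrinks as the horizon grows, which is the book’s point; whether the random-walk model itself holds is another matter, as the next anecdote shows.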

However, while regression to the mean is easily spotted in historical data, at other times it simply fails to materialize.  There is an anecdote from 1959, when bond yields began to surpass stock dividend yields.  Old hands predicted that the situation would soon revert to normal (stocks had always yielded more than the surer bonds, as compensation for their risk).  “I am still waiting. The fact that something so unthinkable could occur has had a lasting impact on my view of life and on investing in particular…has left me skeptical about the wisdom of extrapolating from the past… Never depend on [regression to the mean] to come into play without constantly questioning the relevance of the assumptions that support the procedure.”

This reaches a conclusion in the chapters on Frank Knight and Keynes.  They seem to be saying that all previous probability theory is not applicable to real-world situations involving irrational humans, because all probability-based forecasting rests on a set of past data.  Past results offer no guarantee of future behavior when non-deterministic humanity is involved.

There is always uncertainty: “Under conditions of uncertainty, the choice is not between rejecting a hypothesis and accepting it, but between reject and not-reject.  You can decide that the probability that you are wrong is so small that you should not reject the hypothesis.  You can decide that the probability that you are wrong is so large that you should reject the hypothesis.  But with any probability short of zero that you are wrong – certainty rather than uncertainty – you cannot accept a hypothesis.”

Some treatment of Kahneman, including loss aversion and the endowment effect – “our tendency to set a higher selling price on what we own than what we would pay for the identical item if we did not own it.”

Finally, a brief treatment of portfolio insurance, options, and other derivatives.  I read somewhere that economic crashes are rarer now due to fancy risk-mitigation devices such as these, but that they make the crashes that do come much more severe.


“Trading in the Zone” by Mark Douglas

This is a book on trading psychology.  Too many traders are governed by emotion (both good, like euphoria and a feeling of invincibility, and bad, like fear and greed), which prevents them from being consistent winners.  The key is to look at trading objectively, from a probability standpoint.  You must accept that the probability of a win is never 100%.

Fear manifests itself when we, either consciously or subconsciously, avoid information that would “prove us wrong.”  E.g. we avoid positive news about a market we already exited (because we would have to admit we exited too soon), or we avoid negative news about a current trade (especially one that’s already a loser that we hope will “bounce back” soon).

Consistent winning can be a problem too, if we get a “can’t lose” attitude and become reckless with larger and larger trades.

There is always going to be uncertainty.  The key is to find a strategy that gives an edge, and then not to worry when it sometimes loses – account for that in advance.  Before every trade, predefine the risk (probabilities of up/down), the loss-cutting point, and the profit-taking point.  Don’t emotionally weigh recent wins or losses.  Trade over and over again when you see your edge (risking only some predetermined, small percentage of your equity) and don’t worry when you sometimes lose; just make sure your edge wins on average.  Sounds like he is advising traders to be like an automated algorithm!
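In fact the discipline he describes is mechanical enough to sketch in a few lines.  A toy simulation, assuming a made-up edge; win_prob, reward_risk, and risk_fraction below are placeholders of mine, not numbers from the book:

```python
import random

def trade_the_edge(equity=10_000.0, trades=500, win_prob=0.45,
                   reward_risk=2.0, risk_fraction=0.01, seed=42):
    """Take every signal, risking a fixed small fraction of equity.
    Wins pay reward_risk times the amount risked; losses cost it."""
    rng = random.Random(seed)
    for _ in range(trades):
        risk = equity * risk_fraction          # predetermined, small bet size
        if rng.random() < win_prob:
            equity += risk * reward_risk       # profit-taking point hit
        else:
            equity -= risk                     # loss-cutting point hit
    return equity

print(f"Final equity: ${trade_the_edge():,.2f}")
```

Note that this edge loses more often than it wins; the expectancy per trade is still 0.45 × 2 − 0.55 × 1 = +0.35 times the amount risked, so over many trades the individual losers don’t matter.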

But … (the big but) how do you find an edge???  He doesn’t really go into that at all; it seems his intended audience is technical analysts who already have an edge but fail to use it consistently.  For those without one, well… find one with Quantopian?

I like his approach to “scaling out” of profits.  He reports noticing that 1 in 10 of his trades goes down and hits his initial stop immediately.  Another 2-3 in 10 go up a few ticks but then come back down to the stop.  His solution: scale out of the trade gradually.  When up a few ticks, sell 1/3 of the position.  At some other predefined rise (something higher than a few ticks), sell another 1/3 and reset the stop on the remaining 1/3 to your entry price.  Now you have already captured some profit and have a “risk-free” position to see how it turns out.
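That rule is concrete enough to sketch.  The thirds and the break-even stop are Douglas’s; the tick thresholds, position size, and fill-at-the-stop simplification are illustrative assumptions of mine:

```python
def scale_out(prices, entry, tick=0.25, stop_ticks=4,
              first_target=3, second_target=8, size=300):
    """Bank 1/3 at a small rise, 1/3 at a larger one (moving the stop
    to break-even), and let the final 1/3 ride.  Assumes the stop
    always fills at its own price."""
    stop = entry - stop_ticks * tick
    third = size // 3
    remaining, pnl = size, 0.0
    sold_first = sold_second = False
    for p in prices:
        if p <= stop:                              # stop hit: exit the rest
            return pnl + remaining * (stop - entry)
        if not sold_first and p >= entry + first_target * tick:
            pnl += third * (p - entry)             # bank the first third
            remaining -= third
            sold_first = True
        elif sold_first and not sold_second and p >= entry + second_target * tick:
            pnl += third * (p - entry)             # bank the second third
            remaining -= third
            stop = entry                           # stop to break-even: "risk-free"
            sold_second = True
    return pnl + remaining * (prices[-1] - entry)  # mark the rest at the close

# A path that rises, then falls back through the entry price:
print(scale_out([100.5, 101.0, 102.5, 101.0, 99.0], entry=100.0))  # 350.0
```

Even though this path ends below the entry price, the two banked thirds keep the trade profitable, which is the whole appeal.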


“Thinking, Fast and Slow” by Daniel Kahneman


Daniel Kahneman is a psychologist who studies decision making, and he won the Nobel Prize in Economics in 2002.  This book is a layperson’s summary of the decision-making mistakes we humans tend to make over and over again.  In the conclusion, Kahneman says his intent is to provide a new vocabulary, so that we can recognize these flaws (usually in our fast-thinking, instinctive “System 1”) and allow our more-rational “System 2” (the slow-thinking one) to appropriately assess things.

Lots of interesting stuff, with plenty of examples.  Here are some notes I took while reading; at the end I’ll put in my final two cents.

  • “Many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.” There’s an academic put-down for you…
  • Anchoring effect and anchoring index.  We are influenced by an initial number or idea we see.  This is partially why sale prices work.  “What would you expect to pay for this set of knives?  $100?  Well, we will let you have them today for only $59.95!”  Without even knowing anything about the knives, this already seems like a deal.  (Unless you are as jaded toward blatant marketing as I am.)
  • Availability bias – we tend to estimate the likelihood of some event by how readily examples come to mind.  Related to “availability cascade”: we hear about terrorist attacks often, but they are actually quite rare compared to other dangers, like car accidents.
  • Substitution – the answer to an easy question (usually a query of our gut feeling about something) serves as the answer to a harder question.  “How much should I donate to save the whales?” –> “How much do I like whales?”  And then “System 1” (the fast thinker) is good at translating between scales.
  • “Probability neglect” – people are bad at dealing with small risks; we either ignore them completely or grossly overestimate them.
  • Illusion of validity – it is nearly impossible to predict future outcomes in almost all areas, yet we are still confident of our own abilities and those of other seemingly competent professionals.  Algorithms and statistics on past similar situations are often better than intuition.
  • “Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck.”
  • “The outcome of a startup depends as much on the achievements of its competitors and on changes in the market as on its own efforts.”
  • Experts tend to ignore uncertainty: “Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients.”  In one study, doctors’ diagnoses were only about 60% correct when compared against autopsies.
  • Prospect theory – people are generally loss averse.  We tend to be risk averse regarding likely gains (preferring the sure thing), though risk seeking for long shots like lotteries, and we become risk seeking when only bad options are available.  Rather than accept a loss, we often opt for a gamble with high probability of an even larger loss – too enticed by “it’s a long shot but if it works all our troubles will be over!”  (See the value-function sketch after this list.)
  • Additional findings of prospect theory: we pay a premium for certainty of gains or certain avoidance of losses.
  • Kind of a different topic in the last section of the book, but interesting finding – we value our memory of something (or anticipated memory) more than the actual event.  Eg. excessive picture taking by tourists.  “Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”
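As promised above, the shape behind those prospect-theory bullets can be written down in two lines.  A minimal sketch of the value function, using the parameter estimates (alpha ≈ 0.88, lambda ≈ 2.25) from Tversky and Kahneman’s 1992 paper rather than anything in this book:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Felt value of a gain/loss of x relative to a reference point:
    concave for gains, steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

print(prospect_value(100))    # ~  57.5: what a $100 gain feels like
print(prospect_value(-100))   # ~ -129.4: the same-size loss hurts ~2.25x more
```

The asymmetry is the whole story: sure gains get locked in, while looming losses get gambled on.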

Ok, my thoughts now.  The cure for emotion-based loss aversion is to think in, and act on, probabilities.  Avoid opportunities to second-guess probabilistically rational decisions.  An example of sub-rational behavior in the face of probabilities:

Would you rather choose:

a) 95% chance to win $100

b) a sure $80

The sure thing, b), seems like a good choice, even when you “know” that the expected value of a) is $95.  Our decisions are (sometimes) a sequence of such choices; when we always pick the sure thing we are shortchanging ourselves over the long run.  “You win some, you lose some” is a winning attitude — when coupled with sound probabilities, of course.
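A quick sketch of why that attitude pays: face the same choice many times, and option a) reliably beats option b) in total.

```python
import random

rng = random.Random(7)
n = 1_000                                # how many times we face the choice

# a) 95% chance to win $100 (else nothing), each time
gamble_total = sum(100 for _ in range(n) if rng.random() < 0.95)
# b) the sure $80, every time
sure_total = 80 * n

print(f"gamble total: ${gamble_total:,}  (expected ${95 * n:,})")
print(f"sure total:   ${sure_total:,}")
```

With these numbers the gamble finishes about $15,000 ahead on average, and over 1,000 repetitions the chance of it finishing behind the sure total is negligible.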

And therein lies the real problem, I think.  We lean toward preferring the sure thing because in real-world decisions, we often don’t know the true probabilities of anything but trivial or “toy” decisions.