
“Trading in the Zone” by Mark Douglas

This is a book on trading psychology.  Too many traders are governed by emotion (both good, like euphoria and a feeling of invincibility, and bad, like fear and greed), which prevents them from being consistent winners.  The key is to look at trading objectively, from a probability standpoint.  You must accept that the probability of a win is never 100%.

Fear manifests itself when we, either consciously or subconsciously, avoid information that would “prove us wrong.”  E.g., we avoid positive news about a market we already exited (because we would have to admit we exited too soon), or we avoid negative news about a current trade (especially one that’s already a loser that we hope will “bounce back” soon).

Consistent winning can be a problem too, if we get a “can’t lose” attitude and become reckless with larger and larger trades.

There is always going to be uncertainty.  The key is to find a strategy that gives an edge, and then not worry when it sometimes loses – account for that up front.  Before every trade, predefine: the risk (probabilities of up/down), the loss-cutting point, and the profit-taking point.  Don’t let recent wins or losses color the decision emotionally.  Trade over and over again whenever you see your edge (risking only some predetermined, small percentage of your equity) and don’t worry when you sometimes lose; just make sure your edge wins on average.  Sounds like he is advising traders to behave like an automated algorithm!
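That prescription – take every valid signal, risk a small fixed fraction of equity, and let the average do the work – can be sketched as a quick Monte Carlo simulation.  All the numbers here (the 55% win rate, 1:1 payoff, 2% risk per trade) are hypothetical illustrations, not from the book:

```python
import random

def simulate(trades=1000, win_prob=0.55, payoff=1.0,
             risk_frac=0.02, start=10_000.0, seed=42):
    """Trade a hypothetical edge repeatedly, risking a small fixed
    fraction of current equity per trade, ignoring streaks entirely."""
    rng = random.Random(seed)
    equity = start
    for _ in range(trades):
        stake = equity * risk_frac       # predetermined, small % of equity
        if rng.random() < win_prob:
            equity += stake * payoff     # winner pays `payoff` times the risk
        else:
            equity -= stake              # loser costs exactly the risked amount
    return equity

# Plenty of individual trades lose (~45% here), but the edge wins on average.
print(f"final equity: ${simulate():,.0f}")
```

The point of the sketch is that no single trade matters: the fixed fractional risk keeps any one loser small, and the positive expectancy does the rest over many repetitions.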

But … (the big but) how do you find an edge???  He doesn’t really go into that at all; it seems his intended audience is technical analysts who already have an edge but fail to use it consistently.  For those without one, well… find one with Quantopian?

I like his approach to “scaling out” profits.  He reports noticing that about 1 in 10 of his trades go down and hit his initial stop immediately.  Another 2–3 in 10 go up a few ticks but then fall back to the stop.  His answer: scale out of the trade gradually.  When up a few ticks, sell 1/3 of the position.  At some other predefined rise (higher than a few ticks), sell another 1/3 and reset the stop on the remaining 1/3 to your entry price.  Now you have already captured some profit and hold a “risk-free” position to see how it turns out.
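As a sanity check on that arithmetic, here is a tiny sketch of the plan; the prices and position size are made up for illustration:

```python
def scale_out_plan(entry, first_target, second_target, shares):
    """Split a position into thirds per the scale-out rule: sell 1/3 at
    each target, then reset the stop on the remaining 1/3 to entry."""
    third = shares // 3
    legs = [(first_target, third),    # up a few ticks: first third off
            (second_target, third)]   # higher predefined rise: second third
    locked_in = sum(qty * (price - entry) for price, qty in legs)
    remaining = shares - 2 * third    # the "risk-free" position
    new_stop = entry                  # worst case on the rest is break-even
    return legs, locked_in, remaining, new_stop

legs, locked_in, remaining, new_stop = scale_out_plan(
    entry=50.00, first_target=50.10, second_target=50.50, shares=300)
```

With these made-up numbers, $60 of profit is locked in by the first two sells, and the last 100 shares can run without risking the account below break-even.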



“Thinking, Fast and Slow” by Daniel Kahneman


Daniel Kahneman is a psychologist who studies decision making and recently won the Nobel Prize in Economics.  This book is a lay-person summary of common decision-making flaws we humans tend to make over and over again.  In the conclusion, Kahneman says his intent is to provide a new vocabulary, so that we can recognize these flaws (usually in our fast-thinking, instinctive “System 1”) and allow our more-rational “System 2” (the slow-thinking one) to appropriately assess things.

Lots of interesting stuff, with plenty of examples.  Here are some notes I took while reading; at the end I’ll put in my final two cents.

  • “Many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.” There’s an academic put-down for you…
  • Anchor effect and anchoring index.  We are influenced by an initial number or idea we see.  This is partially why sale prices work.  “What would you expect to pay for this set of knives?  $100?  Well, we will let you have them today for only $59.95!”  Without even knowing anything about the knives, this already seems like a deal.  (unless you are as jaded towards blatant marketing as I am.)
  • Availability bias – we tend to estimate the likelihood of some event by how readily examples come to mind.  Related to “availability cascade”: we hear about terrorist attacks often, but they are actually quite rare compared to other dangers, like car accidents.
  • Substitution – an answer to an easy question (usually a query of our gut feeling about something) serves as the answer to a harder question.  “How much should I donate to save the whales?” –> “How much do I like whales?”  And “System 1” (the fast thinker) is good at translating between scales.
  • “Probability neglect” – people are bad at dealing with small risks: we either ignore them completely or grossly overestimate them.
  • Illusion of validity – nearly impossible to predict future outcomes in almost all areas, yet we are still confident of our own abilities or those of other seemingly-competent professionals.  Algorithms, stats on past similar situations are often better than intuition.
  • “Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck.”
  • “The outcome of a startup depends as much on the achievements of its competitors and on changes in the market as on its own efforts.”
  • Experts tend to ignore uncertainty: “Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients.”  Doctors had about 60% correct diagnoses compared to autopsies in one study.
  • Prospect theory – people are generally loss averse.  We are risk averse regarding likely gains (we prefer the sure thing), but become risk seeking when only bad options are available.  Rather than accept a loss, we often opt for a gamble with a high probability of an even larger loss – too enticed by “it’s a long shot, but if it works all our troubles will be over!”
  • Additional findings of prospect theory: we pay a premium for certainty of gains or certain avoidance of losses.
  • Kind of a different topic in the last section of the book, but an interesting finding – we value our memory of something (or the anticipated memory) more than the actual event.  E.g., excessive picture taking by tourists.  “Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”

Ok, my thoughts now.  The cure for emotion-based loss aversion is to think in, and act on, probabilities.  Avoid opportunities to second-guess probabilistically rational decisions.  An example of sub-rational behavior in the face of probabilities:

Would you rather choose:

a) 95% chance to win $100

b) $80

The sure thing, b), seems like a good choice, even when you “know” that the expected value of a) is $95.  Our decisions are (sometimes) a sequence of such choices; when we always pick the sure thing, we are shortchanging ourselves over the long run.  “You win some, you lose some” is a winning attitude – when coupled with sound probabilities, of course.
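A quick sketch of how that compounds when the same choice repeats.  The 95%/$80 numbers are from the example above; the number of repetitions is arbitrary:

```python
import random

def repeated_choices(n=10_000, seed=1):
    """Face the same gamble-vs-sure-thing choice n times and total it up."""
    rng = random.Random(seed)
    sure = 80 * n                                               # option b every time
    gamble = sum(100 for _ in range(n) if rng.random() < 0.95)  # option a every time
    return sure, gamble

sure, gamble = repeated_choices()
# The gamble's expected value is $95 per decision vs. $80 for the sure
# thing, so over many repetitions the gamble pulls far ahead.
```

Over 10,000 repetitions the always-gamble total lands near $950,000 versus a guaranteed $800,000 – the occasional $0 outcome stings, but refusing the gamble every time is the expensive habit.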

And therein lies the real problem, I think.  We lean toward the sure thing because, in real-world decisions, we rarely know the true probabilities for anything but trivial or “toy” choices.