Book Title: The Black Swan: The Impact of the Highly Improbable
ISBN: 978-1400063512
How strongly I recommend it: 2/5 Stars
The central premise of this book is that random, profound events, or ‘Black Swans’, occur far more often than we expect, and we are pretty bad at dealing with them.
The ideas discussed in this book are extremely valuable and have far-reaching implications. I can relate to a lot of what Taleb has to say about human biases. However, this was by no means an easy (or enjoyable) read. Taleb has a tendency to go off on tangents, spends far too long emphasizing particular points, and at times the book reads more like an academic text with limited real-world application.
In short, a good book but probably 150 pages longer than it needed to be.
How I discovered it? I often come across the term ‘Black Swan’ in a financial setting. I bought this book because I wanted to understand what that term actually meant, from the person who coined it.
Who should read it? Anyone interested in probability, markets and portfolio construction.
SUMMARY
Taleb starts by outlining the human biases that encourage us to neglect or misunderstand Black Swans. The fundamental problem is that the real world is too complicated to rely solely on past data to make inferences about the future. As a result, we will continue to experience dramatic, random events (good or bad) that our models fail to predict.
Taleb then moves on to explain that the optimal way of handling Black Swans is to target uncapped returns with limited downside (‘lottery tickets’). In other words, if you know you’ll experience a Black Swan at some point, you should set yourself up to benefit from it by having extremely high payoffs if and when it occurs. Some examples include living in a busy city because of the higher likelihood of serendipitous encounters, choosing a career path where your returns can scale infinitely, and barbelling your portfolio between extremely risky and extremely safe investments. More on those ideas below.
KEY TAKE-AWAYS
- What is a ‘Black Swan’?
Black Swans are (1) outliers, (2) profound in impact and (3) retrospectively predictable.
In simple terms, they are extreme events which have a big impact and appear to have been predictable, but only after they occur. Classic examples include 9/11, Black Monday and Covid-19.
- The problem of Induction
The way we formulate knowledge is by inferring from past events e.g. I know burning is a painful sensation because I recall what it felt like to touch something hot.
In a similar fashion, most predictive models use the past to predict the future. However, this process inherently ignores Black Swans because it assumes that the past is representative of the future, yet extreme events can drastically alter future outcomes.
- Just because it happened, doesn’t mean it was going to happen
The only way to determine causality is by conducting experiments (and observing the alternative outcomes). A fundamental failure of traditional schooling is that you’re generally taught to infer causality from backward-looking history. Whilst it’s not always possible to consider alternative scenarios, when it is possible, we should make a point of doing so.
- Key Human Biases to keep an eye out for
Taleb covers a number of human biases which weigh on our thinking. The ones I could relate to the most are below. I want to be more aware of these biases in my own thinking going forward.
Confirmation bias – applying greater weighting to data which confirms our original view
Narrative fallacy – creating stories with causal relationships to explain things (even when causality isn’t proven to exist)
Ludic fallacy – treating real-world situations like simple games with fixed, known rules, e.g. assuming every 1 inch of rain increases road traffic by exactly 10 minutes
Silent evidence – drawing conclusions from the evidence that survived, only, e.g. estimating the odds of getting rich at a casino by looking only at the winning gamblers, while ignoring everyone else who started in the cohort and lost (see the simulation sketch after this list)
Tunnelling – focusing too much on what we know while neglecting what we don’t know
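To make the silent evidence point concrete, here’s a tiny simulation sketch (my own toy numbers, not from the book). It judges a casino by its winners alone and gets a completely misleading answer:

```python
import random

# Toy illustration of silent evidence / survivorship bias (illustrative
# numbers, not Taleb's): 10,000 gamblers each place 100 even-money bets
# with a slight house edge. Judging the game by the winners alone makes
# gambling look profitable.
random.seed(42)

N_GAMBLERS, N_BETS, STAKE = 10_000, 100, 1
WIN_PROB = 0.48  # slight house edge

def final_balance() -> int:
    """Net result of one gambler's 100 bets."""
    balance = 0
    for _ in range(N_BETS):
        balance += STAKE if random.random() < WIN_PROB else -STAKE
    return balance

balances = [final_balance() for _ in range(N_GAMBLERS)]
winners = [b for b in balances if b > 0]

print(f"Average result, whole cohort: {sum(balances) / len(balances):+.2f}")
print(f"Average result, winners only: {sum(winners) / len(winners):+.2f}")
print(f"Share of cohort that won:     {len(winners) / len(balances):.0%}")
```

The whole cohort loses money on average; the surviving winners look comfortably profitable. The silent evidence is everyone who isn’t around to tell their story.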
- We live in Extremistan
Taleb distinguishes between two types of worlds: (1) Mediocristan and (2) Extremistan.
Mediocristan is a pretty simple world where the more information you have, the better you will be at predicting things. This is because the law of large numbers applies here, i.e. the more data you collect, the closer your sample average gets to the true average. Data sets which fall into this category are things like the heights of adults living in the UK.
Extremistan is more complicated. More information doesn’t mean better predictability. Outliers exist and have a huge impact on the average, e.g. average wealth in the US. Most human-made data sets fall into this bucket.
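A quick toy simulation (my illustration, not Taleb’s) of the difference: in Mediocristan no single observation matters much, while in Extremistan one outlier can carry a large share of the total.

```python
import random

# Mediocristan vs Extremistan with made-up parameters: heights drawn
# from a Gaussian, wealth drawn from a heavy-tailed Pareto distribution.
random.seed(0)

heights = [random.gauss(170, 10) for _ in range(10_000)]              # cm
wealth = [random.paretovariate(1.1) * 50_000 for _ in range(10_000)]  # $

def share_of_biggest(sample: list[float]) -> float:
    """Fraction of the total carried by the single largest observation."""
    return max(sample) / sum(sample)

print(f"Heights: tallest person is {share_of_biggest(heights):.4%} of the total")
print(f"Wealth:  richest person is {share_of_biggest(wealth):.4%} of the total")
```

The tallest person barely moves the average height; the richest person can dominate average wealth. That is why sampling more data helps in the first world and can still mislead you in the second.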
- Most real-life events don’t fit the Bell Curve
If you believe that daily stock market returns follow a bell curve (or normal distribution), then returns should hover around a mean and extreme events become vanishingly unlikely. Under this assumption, a day like Black Monday in 1987, when the stock market declined by 23% in a single session, should occur only once in several billion years. Clearly that doesn’t hold up, and the same applies to most real-world events. (The scary thing is that normal distributions are assumed in many risk management models, like the VaR calculations used to determine trading risks.)
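As a back-of-the-envelope check (assuming, for illustration, daily returns with a standard deviation of about 1% — my number, not the book’s), here is how improbable Black Monday would be under normality:

```python
from math import exp, pi, sqrt

# Assumed daily standard deviation of stock returns (illustrative).
daily_sigma = 0.01
z = 0.23 / daily_sigma  # Black Monday's -23% day is a ~23-sigma event

# Upper-tail probability via the Mills-ratio approximation,
# P(Z > z) ~= phi(z) / z, which is very accurate for large z.
phi = exp(-z * z / 2) / sqrt(2 * pi)  # standard normal density at z
p = phi / z

print(f"P(a -23% day under normality): ~{p:.2e}")            # ~2e-117
print(f"Roughly one such day per {1 / (p * 252):.2e} years")  # of 252 trading days
```

Under these assumptions the event is a 23-sigma move, with a waiting time absurdly longer than the age of the universe. Since it actually happened, the normality assumption, not the market, is what broke.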
- Aim for Convex Returns!
Black Swans will happen and you don’t know how they will impact you (they can be positive or negative). In investing therefore, you should try to set yourself up for as much convexity in returns as possible. In simple terms, buy (cheap) options with limited downside and uncapped upside. The only thing you know is that something unpredictable and profound will happen (but you don’t know what that may be).
One way to do this in investing is by barbelling your portfolio. Keep 80-85% in extremely safe, low-volatility investments like Treasuries. Then put the rest in extremely risky, speculative investments (like call options). This is how Taleb made his fortune.
Another way is to run a speculative portfolio but insure it against a large drop (of, say, 15%). This is something I need to think more about applying to my own portfolio.
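Here is a toy sketch of the barbell payoff. The 85/15 split, the 4% Treasury yield and the option multiples are all made-up illustrative numbers, not Taleb’s actual allocations:

```python
# Barbell sketch: 85% in safe Treasuries, 15% in speculative call
# options that usually expire worthless but occasionally pay off
# many times over (all numbers are assumptions for illustration).
SAFE_WEIGHT, RISKY_WEIGHT = 0.85, 0.15
SAFE_YIELD = 0.04  # assumed annual Treasury yield

def barbell_return(option_multiple: float) -> float:
    """Portfolio return if the option sleeve returns `option_multiple`x
    (0.0 = options expire worthless, 10.0 = a 10x Black Swan payoff)."""
    safe_part = SAFE_WEIGHT * (1 + SAFE_YIELD)
    risky_part = RISKY_WEIGHT * option_multiple
    return safe_part + risky_part - 1

for multiple in (0.0, 1.0, 10.0):
    print(f"options return {multiple:>4.1f}x -> portfolio {barbell_return(multiple):+.1%}")
```

The worst case (options go to zero) caps the loss at about -11.6% under these numbers, while the upside from a tail event is uncapped — which is exactly the convexity the section above describes.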
FAVORITE QUOTES/STORIES
Turkey on Thanksgiving Day
Knowledge gained from observation is weak when we live in Extremistan (see above), where extreme events take place. Take the example of a turkey being fed every day in the run-up to Thanksgiving. The day prior to Thanksgiving, the turkey will think humans are extremely generous and caring (to give it so much food). On Thanksgiving Day, the turkey discovers this was not a good prediction!
What you know cannot really hurt you. It’s the stuff you don’t know that can really hurt.
Knowledge is built on falsification, not on verification.
(Non) Scalability of returns
Taleb uses the example of choosing a profession to emphasize the power of having infinitely scalable returns. If you choose to be a prostitute, you are inherently limited in the amount of money you can earn, as it is directly a function of your time. However, if you build a software product and license it to companies, you have uncapped potential returns, as your income is not a function of your time (i.e. you can sell the same software to multiple businesses).
Happiness
An anecdote in the book, but an interesting experiment. Happiness depends on the frequency of happy feelings, not on their intensity. In experiments, people who earned $100k/year over a period of 10 years were happier than those who received $1m in one lump-sum payment in Year 1.
Think outside of the box
A good example of the ludic fallacy: people think inside the box, using theoretical models from Mediocristan to predict outcomes in Extremistan, and end up missing the complexities of real-life problems.
Imagine you’re told that a coin has been flipped 100 times and it has landed on heads 99 times and on tails once. Now you’re asked to predict the odds of the next flip landing on heads. Most people will say it’s 50%. The out-of-the-box thinker will argue that the chance of landing on heads 99 times out of 100 is so small that this is clearly not a fair coin (see the quick calculation below).
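The arithmetic behind the out-of-the-box answer, as a quick sketch (assuming, for the sake of argument, that the coin really were fair):

```python
from math import comb

# If the coin were fair, how likely is 99 heads in 100 flips?
n, k = 100, 99
p_fair = comb(n, k) * 0.5 ** n  # binomial probability: C(100, 99) / 2^100

print(f"P(99 heads out of 100 | fair coin) = {p_fair:.1e}")  # ~7.9e-29
```

At roughly 8 in 10^29, the "fair coin" hypothesis is effectively impossible, so the sensible bet is that the coin is rigged and the next flip is close to certain to land heads, not 50/50.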
Evolutionary bias
A lot of our biases in thinking come from evolutionary traits. One such trait is to assign causality to events by creating stories linking things together, even if they are not actually causally related. This is referred to as the narrative fallacy. Evolutionary biologists think we do this because it’s a lot easier to hold stories in our heads than raw, random data.