Black swan theory

From Wikipedia, the free encyclopedia

The black swan theory or theory of black swan events is a metaphor that describes an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact with the benefit of hindsight. The term is based on an ancient saying that presumed black swans did not exist; the saying was reinterpreted after black swans were discovered in the wild.

The theory was developed by Nassim Nicholas Taleb to explain:

  1. The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology.
  2. The non-computability of the probability of the consequential rare events using scientific methods (owing to the very nature of small probabilities).
  3. The psychological biases which blind people, both individually and collectively, to uncertainty and to a rare event's massive role in historical affairs.

Unlike the earlier and broader "black swan problem" in philosophy (i.e. the problem of induction), Taleb's "black swan theory" refers only to unexpected events of large magnitude and consequence and their dominant role in history. Such events, considered extreme outliers, collectively play vastly larger roles than regular occurrences.[1]:xxi More technically, in the scientific monograph 'Silent Risk',[2] Taleb mathematically defines the black swan problem as "stemming from the use of degenerate metaprobability".[2]


The phrase "black swan" derives from a Latin expression; its oldest known occurrence is the poet Juvenal's characterization of something being "rara avis in terris nigroque simillima cygno" ("a rare bird in the lands and very much like a black swan").[3]:165 When the phrase was coined, the black swan was presumed not to exist. The importance of the metaphor lies in its analogy to the fragility of any system of thought. A set of conclusions is potentially undone once any of its fundamental postulates is disproved. In this case, the observation of a single black swan would be the undoing of the logic of any system of thought, as well as any reasoning that followed from that underlying logic.

Juvenal's phrase was a common expression in 16th century London as a statement of impossibility. The London expression derives from the Old World presumption that all swans must be white because all historical records of swans reported that they had white feathers.[4] In that context, a black swan was impossible or at least nonexistent.

However, in 1697, Dutch explorers led by Willem de Vlamingh became the first Europeans to see black swans, in Western Australia.[5] The term subsequently metamorphosed to connote the idea that a perceived impossibility might later be disproven. Taleb notes that in the 19th century, John Stuart Mill used the "black swan" fallacy as a term for falsification.[6]

Black swan events were discussed by Nassim Nicholas Taleb in his 2001 book Fooled by Randomness, which concerned financial events. His 2007 book The Black Swan extended the metaphor to events outside of financial markets. Taleb regards almost all major scientific discoveries, historical events, and artistic accomplishments as "black swans"—undirected and unpredicted. He gives the rise of the Internet, the personal computer, World War I, the dissolution of the Soviet Union, and the September 2001 attacks as examples of black swan events.[1]:prologue

Taleb asserts:[7]

What we call here a Black Swan (and capitalize it) is an event with the following three attributes.

First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme 'impact'. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.

I stop and summarize the triplet: rarity, extreme 'impact', and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.

Identifying a black swan event

Based on the author's criteria:

  1. The event is a surprise (to the observer).
  2. The event has a major effect.
  3. After the first recorded instance of the event, it is rationalized by hindsight, as if it could have been expected; that is, the relevant data were available but unaccounted for in risk mitigation programs. The same is true for the personal perception by individuals.

Coping with black swan events

The main idea in Taleb's book is not to predict black swan events, but to build robustness against negative ones that occur and to be able to exploit positive ones. Taleb contends that banks and trading firms are very vulnerable to hazardous black swan events and are exposed to unpredictable losses. On the subject of business in particular, Taleb is highly critical of the widespread use of the normal distribution model as the basis for calculating financial risk, calling it a Great Intellectual Fraud.

In the second edition of The Black Swan, Taleb provides "Ten Principles for a Black-Swan-Robust Society".[1]:374–78[8]

Taleb states that a black swan event depends on the observer. For example, what may be a black swan surprise for a turkey is not a black swan surprise to its butcher; hence the objective should be to "avoid being the turkey" by identifying areas of vulnerability in order to "turn the Black Swans white".[9]

Epistemological approach

Taleb's black swan is different from the earlier philosophical versions of the problem, specifically in epistemology, as it concerns a phenomenon with specific empirical and statistical properties which he calls "the fourth quadrant".[10]

Taleb's problem concerns epistemic limitations in some areas of decision making. These limitations are twofold: philosophical (mathematical) and empirical (known human epistemic biases). The philosophical problem is the decrease in knowledge when it comes to rare events: because such events are not visible in past samples, they require a strong a priori assumption, or an extrapolating theory; accordingly, predictions of events depend more and more on theory as their probability gets smaller. In the fourth quadrant, knowledge is uncertain and consequences are large, requiring more robustness.[citation needed]

According to Taleb,[11] thinkers who came before him who dealt with the notion of the improbable, such as Hume, Mill, and Popper, focused on the problem of induction in logic, specifically, that of drawing general conclusions from specific observations. The central and unique attribute of Taleb's black swan event is its high profile. His claim is that almost all consequential events in history come from the unexpected — yet humans later convince themselves that these events are explainable in hindsight.

One problem, labeled the ludic fallacy by Taleb, is the belief that the unstructured randomness found in life resembles the structured randomness found in games. This stems from the assumption that the unexpected may be predicted by extrapolating from variations in statistics based on past observations, especially when these statistics are presumed to represent samples from a normal distribution. These concerns often are highly relevant in financial markets, where major players sometimes assume normal distributions when using value at risk models, although market returns typically have fat tail distributions.[12]
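The gap between the two assumptions can be made concrete. The following sketch (not from Taleb's book; the tail exponent alpha = 3 is an arbitrary illustrative choice) compares the probability of a large deviation under a standard normal model with the same probability under a simple power-law (fat) tail, using only the Python standard library:

```python
import math

def normal_tail(k: float) -> float:
    """P(Z > k) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def power_law_tail(k: float, alpha: float = 3.0) -> float:
    """P(X > k) for a Pareto-style tail with survival function k**-alpha (k >= 1)."""
    return k ** (-alpha)

for k in (3, 5, 10):
    print(f"{k}-sigma move: normal {normal_tail(k):.2e}, power law {power_law_tail(k):.2e}")
```

A 5-unit deviation that the normal model treats as roughly a three-in-ten-million event is, under this fat-tailed alternative, closer to a one-in-a-hundred event; a value-at-risk figure computed under the normal assumption therefore drastically understates the chance of extreme losses.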

Taleb said "I don't particularly care about the usual. If you want to get an idea of a friend's temperament, ethics, and personal elegance, you need to look at him under the tests of severe circumstances, not under the regular rosy glow of daily life. Can you assess the danger a criminal poses by examining only what he does on an ordinary day? Can we understand health without considering wild diseases and epidemics? Indeed the normal is often irrelevant. Almost everything in social life is produced by rare but consequential shocks and jumps; all the while almost everything studied about social life focuses on the 'normal,' particularly with 'bell curve' methods of inference that tell you close to nothing. Why? Because the bell curve ignores large deviations, cannot handle them, yet makes us confident that we have tamed uncertainty. Its nickname in this book is GIF, Great Intellectual Fraud."

More generally, decision theory, which is based on a fixed universe or a model of possible outcomes, ignores and minimizes the effect of events that are "outside the model". For instance, a simple model of daily stock market returns may include extreme moves such as Black Monday (1987), but might not model the breakdown of markets following the 9/11 attacks. A fixed model considers the "known unknowns", but ignores the "unknown unknowns", made famous by a statement of Donald Rumsfeld.[13] The term "unknown unknowns" appeared in a 1982 New Yorker article on the aerospace industry, which cites the example of metal fatigue, the cause of crashes in Comet airliners in the 1950s.[14]

Taleb notes that other distributions are not usable with precision, but often are more descriptive, such as the fractal, power law, or scalable distributions and that awareness of these might help to temper expectations.[15]
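What "scalable" means here can be illustrated with a small numerical check (again a sketch under an arbitrary illustrative exponent alpha = 3, not a calculation from Taleb): for a power law, the probability that a deviation doubles, given that it has already occurred, is the same constant at every scale, whereas for a Gaussian that ratio collapses toward zero as deviations grow:

```python
import math

def normal_tail(k: float) -> float:
    """P(Z > k) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def power_tail(k: float, alpha: float = 3.0) -> float:
    """P(X > k) under a power-law survival function k**-alpha."""
    return k ** (-alpha)

# For the power law, P(X > 2k) / P(X > k) equals 2**-alpha at every k
# (scale invariance); for the Gaussian the same ratio shrinks rapidly,
# so ever-larger deviations become effectively impossible under the bell curve.
for k in (2, 4, 8):
    print(k, power_tail(2 * k) / power_tail(k), normal_tail(2 * k) / normal_tail(k))
```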

Beyond this, he emphasizes that many events simply are without precedent, undercutting the basis of this type of reasoning altogether.

Taleb also argues for the use of counterfactual reasoning when considering risk.[7]:xvii[16]

See also


References

  1. ^ a b c Taleb, Nassim Nicholas (2010) [2007]. The Black Swan: The Impact of the Highly Improbable (2nd ed.). London: Penguin. ISBN 978-0-14103459-1. Retrieved 23 May 2012.
  2. ^ a b Taleb, Nassim Nicholas (2015), Doing Statistics Under Fat Tails: The Program, retrieved 20 January 2016 
  3. ^ Puhvel, Jaan (Summer 1984). "The Origin of Etruscan tusna ("Swan")". The American Journal of Philology. Johns Hopkins University Press. 105 (2): 209–212. doi:10.2307/294875. JSTOR 294875.
  4. ^ Taleb, Nassim Nicholas. "Opacity". Fooled by randomness. Retrieved 20 January 2016. 
  5. ^ "Black Swan Unique to Western Australia", Parliament, AU: Curriculum, archived from the original on 2009-09-13 .
  6. ^ Hammond, Peter (October 2009), "Adapting to the entirely unpredictable: black swans, fat tails, aberrant events, and hubristic models", WERI Bulletin, UK: Warwick (1), retrieved 20 January 2016 
  7. ^ a b Taleb, Nassim Nicholas (22 April 2007). "The Black Swan: Chapter 1: The Impact of the Highly Improbable". The New York Times. Retrieved 20 January 2016. 
  8. ^ Taleb, Nassim Nicholas (7 April 2009), Ten Principles for a Black Swan Robust World (PDF), Fooled by randomness, retrieved 20 January 2016 
  9. ^ Webb, Allen (December 2008). "Taking improbable events seriously: An interview with the author of The Black Swan (Corporate Finance)" (Interview; PDF). McKinsey Quarterly. McKinsey. p. 3. Retrieved 23 May 2012. Taleb: In fact, I tried in The Black Swan to turn a lot of black swans white! That’s why I kept going on and on against financial theories, financial-risk managers, and people who do quantitative finance. 
  10. ^ Taleb, Nassim Nicholas (September 2008), The Fourth Quadrant: A Map of the Limits of Statistics, Third Culture, The Edge Foundation, retrieved 23 May 2012 
  11. ^ Taleb, Nassim Nicholas (April 2007). The Black Swan: The Impact of the Highly Improbable (1st ed.). London: Penguin. p. 400. ISBN 1-84614045-5. Retrieved 23 May 2012. 
  12. ^ Trevir Nath, "Fat Tail Risk: What It Means and Why You Should Be Aware Of It", NASDAQ, 2015
  13. ^ DoD News Briefing - Secretary Rumsfeld and Gen. Myer, February 12, 2002 11:30 AM EDT Archived 3 September 2014 at the Wayback Machine.
  14. ^ Newhouse, J. (14 June 1982), "A reporter at large: a sporty game: i-betting the company", The New Yorker, pp. 48–105 
  15. ^ Gelman, Andrew (April 2007). "Nassim Taleb's "The Black Swan"". Statistical Modeling, Causal Inference, and Social Science. Columbia University. Retrieved 23 May 2012. 
  16. ^ Gangahar, Anuj (16 April 2008). "Market Risk: Mispriced risk tests market faith in a prized formula". The Financial Times. New York. Archived from the original on 20 April 2008. Retrieved 23 May 2012. 


External links