“For every problem, there is a solution that is simple, elegant, and wrong.”
-H.L. Mencken

The quote above, from the man known as “The Sage of Baltimore,” is an apt description of what we now call the narrative fallacy. The term has been most widely popularized by Nassim Taleb in his bestselling book, “The Black Swan: The Impact of the Highly Improbable.”

In conventional terms, a narrative fallacy is the need to put information into a story, or narrative, to explain the unknown. In effect, by creating an explanation, we delude ourselves into believing we understand what we are explaining.

We all like to simplify and summarize complex events; it makes them understandable. In fact, it is ingrained in our natural processes to look at a series of facts, events, or words and, via mental shortcuts, simplify them. Take the series of words below as an example:

A bird in the
The hand is worth
Two in the bush

Is there anything noteworthy about the sequence of words? Take another look. There is actually a typo: “A bird in THE THE hand . . .” This is an example that Taleb uses in his book, so credit is due where credit is due, but it is a very simple manifestation of a discovery made by Australian brain scientist Allan Snyder.

Snyder is famous for his study of “savant syndrome,” a condition in which people with severe mental disorders can exhibit incredible talents in various esoteric fields, such as music, art, and mathematics. Snyder theorizes that “savant” type abilities reside in all of us, but because of how the brain processes information, most people with normally functioning brains are unable to tap into them.

In terms of the series of words above, Snyder discovered that “if you inhibit the left hemisphere of a right-handed person (more technically, by directing low-frequency magnetic pulses into the left frontotemporal lobes), you lower the rate of error in reading [the series above].” The brain naturally looks at the series above, imposes a theme or understanding, and, in fact, glosses over the details. We call this interpretation. It is a mental shortcut that all humans use in varying degrees. Ironically, by limiting part of the brain, we become more effective at seeing things as they actually are, without prejudice.

In highly complex systems, such as investing in the global markets, the creation of narrative fallacies becomes even more likely. The most poignant examples of narrative fallacy are often articulated by the 24/7 business news media, the CNBCs of the world. They are by their very nature constantly reacting to global market events and are required to come up with interpretations of events on the fly. Rarely are these interpretations founded on anything other than mental shortcuts, but they share the one attribute of all narrative fallacies: plausibility. These “plausible” explanations are then adopted by investors who watch CNBC as part of their process. (Incidentally, if anyone can find me a trading floor in America that does not tune into CNBC, I would be shocked. This point seems to verify the broad spread and unknowing acceptance of the narrative fallacy.)

This tendency to impose a narrative, or causality, leads to what Taleb calls “dimension reduction.” As we impose an interpretation on a series of facts or events, we unconsciously rule out, or dramatically underweight, other explanations. In terms of risk management, which requires a healthy dose of scenario analysis, this can be a fatal flaw. Undoubtedly, many of the private equity and long-only levered investors of 2006 and 2007 modeled their investments on future projections that assigned only limited weightings to more extreme scenarios, arbitrarily, or perhaps not so arbitrarily given the narrative fallacies that were their investment memos. Any scenario is, of course, possible if we ignore the facts or probabilities.

Information is costly to obtain, process, and manage. As information expands, these costs increase almost exponentially, and the likelihood of false interpretations expands with them. This kind of complexity can actually be quantified mathematically via Kolmogorov complexity. In very, very basic terms, the complexity of a string of data is the length of the shortest program that can reproduce it: highly patterned data can be described briefly, while random data cannot be described by anything much shorter than itself.
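True Kolmogorov complexity is uncomputable, but compressed size is a standard practical proxy for it. The short Python sketch below is an illustration of that idea only (the function name and sample strings are my own, not from the text): patterned data compresses to far fewer bytes than noisy data of the same length.

```python
import random
import zlib


def compressed_size(s: str) -> int:
    """Rough proxy for Kolmogorov complexity: bytes after zlib compression."""
    return len(zlib.compress(s.encode("utf-8")))


# A highly patterned 300-character string: a tiny "program" describes it.
patterned = "buy" * 100

# A noisy 300-character string (seeded so the example is reproducible).
random.seed(42)
noisy = "".join(random.choice("abcdefgh") for _ in range(300))

# Same length, very different descriptive complexity.
assert len(patterned) == len(noisy) == 300
assert compressed_size(patterned) < compressed_size(noisy)
```

The repetitive string collapses to a handful of bytes because the instruction "repeat 'buy' 100 times" fully describes it, while the noisy string resists compression. In the same spirit, a tidy market narrative that "compresses" messy reality into a short story is, by definition, throwing information away.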

As investors, we operate in a world that is highly random and complex, with an almost infinite number of facts (or at least more facts than we can adequately fit into our brains). So how do we avoid falling into the narrative fallacy trap? The answer, quite simply, is to ignore the initial explanation that our brain, or the talking heads on CNBC, offers, and to call it what it is: a mental shortcut that is likely erroneous.

The solution, rather, is to think. Step back, test the facts, find more facts, use scenario analysis, and then make a decision. Write down your process and thesis in a journal, as if you were a scientist conducting an experiment, and use those notes to verify whether your decision making is based on a valid process.

Daryl G. Jones
Managing Director