Because I live in New York City, I often have conversations with people who find my political views baffling. Many of my interlocutors casually assert that anyone who doesn’t believe that inequality is the foremost problem facing the United States, or who believes that monetary policy failures were a far more significant contributor to our current economic woes than, say, the repeal of Glass-Steagall, is obviously lying or crazy or both. When I gently suggest that monocausal explanations don’t make much sense in a complex world, we usually shift to another subject, and I am always happy to do so. This narrative mode of thinking isn’t limited to people with whom I mostly disagree: it is equally pervasive among people who share many of my political views. Indeed, there are narratives that shape my own way of thinking, often, I assume, to my detriment.
Reid Hastie, a professor at the University of Chicago’s Booth School of Business, offers an elegant hypothesis as to what might be going on:
We know there was no single cause or event that set in motion the crisis and that the truth is complex and multicausal. So why do we keep seeking the easy answers? It may be that we are hard-wired to do so.
The human brain is designed to support two modes of thought: visual and narrative. These forms of thinking are universal across human societies throughout history, develop reliably early in individuals’ lives, and are associated with specialized regions of the brain. What isn’t universal or natural is the kind of highly structured cognitive processes that underlie logical and mathematical thinking — the kinds of analysis that produce the most remarkable human cultural products, especially scientific achievements such as interplanetary travel, electronic devices and genetic engineering. They also allow the types of analysis needed to design effective economic policies and business strategies.
As Hastie explains, the benefits of visual and narrative thinking are significant. Yet narrative thinking has many pitfalls:
[N]arratives give us a false sense of understanding and control, when they are really mere redescriptions of selected subparts of the events to which they refer. Once we have a good narrative summary, we have the illusion that we could have intervened and controlled outcomes, or could have predicted what in hindsight seems to be an obvious outcome. But, unlike valid causal explanations that support informative forecasts and suggest ways to change events further down the causal stream, narratives lack these basic properties of true causal explanations.
Narratives also tend to be dominated by a few major actors, and faux explanatory power is derived from simplistic interpretations of those actors’ characters and motives. And the universal human illusion that consciously accessible thoughts are in the driver’s seat and controlling our own actions means that the salient actors in a narrative we want to understand are attributed information and incentives to a greater degree than is warranted.
Hastie then explains the results of his research (with Benjamin Rottman) on causal reasoning. I found this passage particularly interesting:
For example, if an outcome, such as a forest fire, is most likely produced by any one of several independent causes, then when we are sure that one cause is arson, we should rationally reduce our belief that it was caused by something else, such as lightning or a careless camper.
It appears, however, that humans are a bit too eager to “discount,” and discounting is an error, when conjunctions of causes are the correct explanation, as they usually are. This habit can get us into trouble when we’re reasoning about complex multiply caused events. Perhaps the single-factor explanations for the recent financial recession reflect a bias produced by over-discounting.
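The rational baseline Hastie describes, in which learning that one independent cause was present should reduce our belief in the others, is the "explaining away" effect from probabilistic reasoning, and it can be made concrete with a toy calculation. The probabilities below are made-up illustrative numbers, and the model (fire occurs whenever either independent cause is present) is a deliberate simplification, not anything from Hastie and Rottman's research:

```python
from itertools import product

# Hypothetical, illustrative priors for two independent causes of a forest fire.
p_arson = 0.1
p_lightning = 0.1

def joint(arson, lightning):
    """Joint prior probability of the two independent causes."""
    pa = p_arson if arson else 1 - p_arson
    pl = p_lightning if lightning else 1 - p_lightning
    return pa * pl

def fire(arson, lightning):
    """Simplified model: a fire occurs if either cause is present."""
    return arson or lightning

# P(lightning | fire): condition on observing a fire.
num = sum(joint(a, l) for a, l in product([0, 1], [0, 1]) if fire(a, l) and l)
den = sum(joint(a, l) for a, l in product([0, 1], [0, 1]) if fire(a, l))
p_lightning_given_fire = num / den  # ≈ 0.526

# P(lightning | fire, arson): once arson is known, it fully explains the
# fire, so the fire is no longer evidence for lightning, and belief in
# lightning falls back to its prior.
p_lightning_given_fire_and_arson = p_lightning  # 0.1

print(round(p_lightning_given_fire, 3))            # 0.526
print(round(p_lightning_given_fire_and_arson, 3))  # 0.1
```

Here, observing the fire raises the probability of lightning from 0.1 to about 0.53, and learning that an arsonist was at work correctly drops it back to 0.1. Hastie's point is that this discounting, though rational when causes are genuinely independent alternatives, becomes an error when the right explanation is a conjunction of causes, as it usually is for events like financial crises.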
The punchline to the joke here is that because Hastie’s findings resonate with my experience, I am particularly keen on embracing his findings — which, of course, reflects a particular kind of cognitive bias.