Many scientists believe that climate change will increase the likelihood of extreme weather events, but will we improve our ability to understand and react to these risks?
Nassim Taleb popularized the term “Black Swan” to refer to outlier events that could not have been predicted by drawing from past experience—alluding to the surprise of 17th century European explorers at their first sighting of a black swan in Australia. In his book, Taleb uses the term to refer to unexpected but defining events in human history such as World War I and September 11th, arguing that while no one foresaw these events, in hindsight, human nature causes us to invent explanations of why they were predictable. Since his book, the term has been used to refer to everything from the Deepwater Horizon Oil Spill to a design flaw in the iPhone 4 that causes dropped calls.
For many of these unexpected events, the problem may not be that no one could imagine they would happen, but that we are predisposed to disregard the chances that they would.
Behavioral economics research tells us that individuals often dismiss low-probability events, neglecting the possibility of any risk unless it is above some subjective threshold, and that we tend to be overly optimistic about our chances of avoiding a disaster (e.g., Camerer and Kunreuther 1989). These behaviors pose significant challenges to our ability to respond to the risks of high-impact events such as extreme natural disasters. For instance, if the potential damages of a drought are very large in a region, then significant investment to prevent the damages (or insure against them) may be warranted even if the chances of the drought are relatively low. Risk-neutral individuals facing a 10% chance of suffering a $1M loss should still be willing to pay $100,000 to prevent the damages. But if they heuristically ignore any risk with a chance below 20%, then too few prevention measures will be taken.
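The gap between the rational and heuristic valuations can be sketched numerically. The snippet below is illustrative only, using the article's example figures (a 10% chance of a $1M loss and a hypothetical 20% subjective threshold); the function names are invented for this sketch.

```python
def expected_loss(prob, loss):
    """Expected value of a potential loss: what a risk-neutral
    individual should be willing to pay to prevent it."""
    return prob * loss

def perceived_loss(prob, loss, threshold=0.20):
    """Heuristic valuation: any risk below a subjective probability
    threshold (hypothetical 20% here) is dismissed entirely."""
    return prob * loss if prob >= threshold else 0.0

# Rational willingness to pay for prevention:
print(expected_loss(0.10, 1_000_000))   # 100000.0

# Heuristic valuation dismisses the same risk outright:
print(perceived_loss(0.10, 1_000_000))  # 0.0
```

Under this sketch, prevention worth $100,000 in expectation attracts no investment at all once the probability falls below the subjective threshold.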
Another group of researchers has recently considered whether our perception of risk differs based on our personal experience with the impacts. Kousky, Pratt, and Zeckhauser (2010) conjectured that when an event occurs that we have not experienced, we tend to overreact, believing the threat is more likely than it actually is. Conversely, when we encounter events that we have experienced before, we tend to underestimate the risk of a recurrence and may not react when familiar dangers become riskier. These behaviors, if supported by further evidence, could affect many aspects of climate change decision making.
One feature of Taleb’s characterization of Black Swan events is that, in retrospect, we search for explanations for the events even when they don’t really exist. Whether or not this is true for the specific cases he suggests, some evidence suggests that a similar effect may lead us to disregard the likelihood that a damaging event will recur in the future. In discussing the Gulf Oil Spill, Stephen Brown (2010) stated that “there can be a tendency within any industry to dismiss low-probability, catastrophic events such as the Deepwater Horizon disaster as a one-off occurrence, resulting from poor judgment or human error. That tendency could mean that industry standards would not tighten by as much as society deems appropriate. In such a case, society will want higher safety standards than the industry would adopt on its own accord.”
Understanding how we are predisposed to react to low-probability disasters is essential to developing appropriate measures to prevent future damages from incidents like the Gulf Oil Spill as well as extreme weather events.