An Inquiry Into the Nature and Causes of Predictable Surprises and How To Avoid Them
Updated: Dec 5, 2020
Some have incorrectly called the Coronavirus outbreak a Black Swan event, i.e. an event that was “hard-to-predict, rare and beyond the realm of normal expectations in history, science, finance, and technology” (Nassim Nicholas Taleb). But this event was predicted. The World Economic Forum (WEF) had even run simulations of such an event and concluded “the world was unprepared” for it. Details
The WEF had been warning about “unpreparedness” since 2015, and advised, “the weakness of basic preparedness in individual countries is an important obstacle to pandemic responses”.
The outbreak is just another example of a “predictable surprise” (Bazerman and Watkins): “Problems that at least some people are aware of, are getting worse over time, and are likely to explode into a crisis eventually, but are not prioritized by key decision-makers or have not elicited a response fast enough to prevent severe damage”.
The problems behind “predictable surprises” tend to require a significant near-term investment that will not pay off until later. Addressing them may also demand changes to an established organizational culture, or changes from which competing interests do not benefit. These characteristics lead such problems to be wilfully ignored, as Margaret Heffernan describes in her book Wilful Blindness.
Thankfully, predictable surprises on the scale of the Coronavirus are rare, but the 9/11 terrorist attacks and the BP Deepwater Horizon disaster are other examples of large-scale events that were predicted.
However, those examples are just the tip of the iceberg. In fact, almost every business or organisation failure is followed by a postmortem that reveals ‘surprises’ that were actually predicted, but ignored. Read the Full Article
A better understanding of Critical Systems Thinking and Practice could help prevent many of these problems or, at the very least, help improve responses and mitigate the damage.