Why didn’t we see COVID-19 coming?

Richard Logan asks how we deal with the problem of being surprised by 'black swan' events.

Is COVID-19 a ‘black swan’ event? Such an event is one that lies outside the realm of regular expectations: it is rare, has extreme impact and is only retrospectively predictable.

In a complex, interconnected world, there is no shortage of these unexpected black swan events. Recent examples include the 2019 Whakaari/White Island eruption, the 2019 Christchurch mosque shootings, the 2019/2020 Australian bushfires, Auckland’s potential one-in-a-thousand-year drought and even the 2010 Pike River coal mine disaster.

It is normal and natural to simplify complexity, consciously or unconsciously. This is called bounded rationality. One of the many simplifications we make is to use inductive rationality, where we make decisions based on the evidence we have.

Most of the time, this works very well or well enough, but occasionally, because of the simplifications, the potential risks are understated and we may not be aware of that understatement.

This leads to various inductive cognitive biases, such as optimism bias, confirmation bias or hindsight bias, and occasionally leaves us surprised by a black swan event.

Each of the examples above was a surprise in some way for the relevant decision-makers, especially over the timing and impact of the event.

COVID-19 is a typical black swan event: its impact surprised most people, not just in fatalities but in its effects on the economy and on social life. Who would have predicted five months ago that 96 percent of commercial aircraft worldwide would be grounded, that our entire tourism sector would be closed down, or that world oil prices would go into negative territory?

It is unclear when we can expect the current situation to improve, which highlights just how inherently uncertain our future is.

Each of my black swan examples relates to an event with multiple layers and interconnected parts that make it more than just one thing. For example, the explosion at Pike River was not just a mine accident with 29 deaths but also a failure of Pike’s organisation (its planning, operations and monitoring, and the loss of the $300 million or more invested by its 5,000 shareholders) and a blow to the economic development aspirations of the West Coast.

So how do we deal with the problem of being surprised by black swan events or, as philosophers call it, the ‘problem of induction’? One possible answer comes from the work of the Canadian-American scholar Philip Tetlock.

From the 1980s, Tetlock studied large numbers of public policy forecasts in the United States. He and his team have now collected more than a million predictions from 25,000 forecasters.

From this, Tetlock developed a fox/hedgehog thinking model to explain why those he called foxes were much better forecasters than those he described as hedgehogs. He loosely based the metaphor on Isaiah Berlin’s earlier essay of the same name, in which Berlin describes foxes as knowing many little things drawn from many traditions, and hedgehogs as knowing one big thing within one tradition.

One of the key differences under Tetlock’s model is that hedgehogs tend to oversimplify complexity, which makes them overconfident and more often wrong. Foxes are complexity-tolerant and aware of what they don’t know, making them more tentative and cautious than the bold and confident hedgehog forecasters, but their forecasts are more often right.

Six key attitudes relevant to decision-makers dealing with complexity and uncertainty can be derived from Tetlock’s work. These are:

1. Attitude to evidence/knowledge

Decision-makers need to develop a good knowledge base and overview, which they interpret empirically, through observations rather than just theory. Avoid compelling top-down narratives that are neater and tidier than reality.

2. Attitude to changing circumstances or new information

Decision-makers need to be active Bayesian belief updaters, which means having an open mind and being keen to learn; a simple worked example follows this list. Avoid belief perseverance when the evidence changes.

3. Attitude to diversity of thinking and approaches

Decision-makers need to apply multiple lenses, by using multiple tools, models, heuristics or perspectives. Avoid the inside view as the single lens.

4. Attitude to acknowledging mistakes/blame

Decision-makers need to be self-critical, have doubts and admit they can be wrong. They need to carry this attitude into their management style so that it encourages a positive growth and learning environment. Avoid a blame culture.

5. Attitude to uncertainty and complexity

Decision-makers need to be comfortable with their own ignorance and to expect the unexpected, because dynamic, complex systems with many interconnected parts and layers exceed anyone’s ability to understand them completely. Avoid the desire for simple and certain answers, without complications, as that will make you confident but not competent.

6. Attitude to risk/probability

Decision-makers need to be cautious and probabilistic. Avoid unnecessary simplifications, such as equating low probability with no probability, when dealing with high-impact black swan events.
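To make these last attitudes concrete, consider a simple hypothetical illustration of Bayesian updating (point two). Suppose a decision-maker initially judges there is a 10 percent chance a new virus will spread widely here, then receives a surveillance report that is three times more likely to appear if wide spread is coming than if it is not. Bayes’ rule shifts the odds from 1:9 to 3:9, lifting the judged probability from 10 percent to 25 percent: the belief moves in proportion to the strength of the new evidence. Point six works the same way with numbers rather than gut feel: a 1 percent chance of an event that would cost $10 billion is, in expectation, a $100 million problem, not nothing.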

In effect, Tetlock found that forecasters with these six attitudes suffered less from inductive cognitive biases and were less often surprised by the unexpected, whereas hedgehogs were especially prone to being blindsided by black swan events.

It is okay to simplify complexity, but it can be dangerous to oversimplify complexity.

Richard Logan is a PhD candidate in Wellington School of Business and Government at Te Herenga Waka—Victoria University of Wellington.

Read the original article on Newsroom.