One of the integral elements of a crisis is surprise. It is the undefined factor that determines whether an issue develops into a severe crisis. Predicting a forthcoming disaster is easy as long as we accept that, eventually, there will be an internally or externally driven incident that may or may not have an impact on an organization. This deterministic approach should be challenged. We will all die one day, but that doesn’t mean we stop believing in the value of life.
So, how can we challenge the reality of this underlying undetermined factor? Someone might say we should plan carefully and prepare for every eventuality. This is one of the basics of crisis management, but we all know that organizations have failed to deal with incidents despite their well-prepared plans. One of the reasons for such failures is the inability to identify the warning signals of an imminent crisis. Most of the time we identify these signals only after the crisis has occurred, as part of a lessons-learned process, and then we ask ourselves: “Why didn’t we manage to see these signals before the incident?”
People tend to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment. Put simply, people miss the whole picture of an issue because they do not take into account alternative explanations for the evidence they see. These are cognitive biases, which affect belief formation, business and economic decisions, and human behavior in general.
To give an example, one of the cognitive biases that applies in financial markets is the ostrich effect. In behavioral economics, the ostrich effect is the avoidance of apparently risky financial situations by pretending they do not exist. We will refer to the Madoff scandal to show how this bias affects the decision-making process.
The Madoff case is a good illustration of the ostrich effect. Harry Markopolos, an American former securities industry executive, discovered evidence suggesting that Bernard Madoff’s wealth management business was actually a massive Ponzi scheme. In 2000, 2001, and 2005, Markopolos alerted the U.S. Securities and Exchange Commission (SEC) to the fraud, supplying supporting documents, but each time the SEC either ignored him or gave his evidence only a cursory investigation. Madoff was finally exposed as a fraud in December 2008.
Markopolos showed mathematically that Madoff could not realistically produce returns of 1% to 2% every month, in positive territory 96% of the time, tracing a 45-degree line of cumulative profit with essentially no volatility.
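To get a feel for why such a track record is a red flag, here is a minimal back-of-the-envelope simulation. It is only a sketch: the 1.5% mean and 4% monthly standard deviation assumed for a comparable real-world equity strategy are illustrative values of my choosing, not figures from Markopolos’s actual analysis.

```python
import random
import statistics

# Illustrative sketch: compare Madoff's claimed return stream with a
# hypothetical strategy that has the same average return but realistic
# volatility. The 1.5% mean and 4% monthly standard deviation below are
# assumptions for illustration, not figures from Markopolos's analysis.

MONTHS = 120            # ten years of monthly returns
MEAN = 0.015            # ~1.5% average monthly return, as Madoff claimed
REALISTIC_STDEV = 0.04  # rough monthly volatility of an equity strategy

random.seed(42)

def positive_share(returns):
    """Fraction of months with a positive return."""
    return sum(r > 0 for r in returns) / len(returns)

# Madoff's claimed stream: 1% to 2% every month, essentially no volatility.
claimed = [random.uniform(0.01, 0.02) for _ in range(MONTHS)]

# A strategy with the same mean but realistic market volatility.
realistic = [random.gauss(MEAN, REALISTIC_STDEV) for _ in range(MONTHS)]

print(f"claimed:   {positive_share(claimed):.0%} positive months, "
      f"stdev {statistics.stdev(claimed):.2%}")
print(f"realistic: {positive_share(realistic):.0%} positive months, "
      f"stdev {statistics.stdev(realistic):.2%}")
```

Running this, the claimed stream is positive 100% of the time with near-zero volatility, while the strategy with the same mean and realistic volatility is positive only around two thirds of the time. That gap between a smooth 45-degree profit line and what real markets allow is exactly the kind of mismatch Markopolos flagged.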
Even though the SEC executives did not listen to Markopolos, I wonder how really smart people in the finance sector failed to see Madoff’s Ponzi scheme. The warning signals were all around them, but they could not accept that one of the most respected establishments on Wall Street could be a fraud. Markopolos’s book on the Madoff Ponzi scheme, titled ‘No One Would Listen: A True Financial Thriller’, is a great read and I totally recommend it.
Cognitive biases are an integral part of human behavior, and as such they affect decision making in crisis management. However, it is up to us whether these biases play a bigger or smaller role, first in identifying the risks and then in dealing with the issues or crises that follow.
The opinions expressed in this article are those of the author, and they do not reflect in any way those of his various affiliations.