
Complexity, cockroaches and building resilience

The near collapse of the financial system was fairly widely predicted, though the political community remains somewhat in denial about that. What was less widely foreseen was that it would happen in September 2008: it was, as we risk geeks say, a risk waiting to materialise.

Two authors with a gold-plated prediction record on this, essentially from a risk perspective, are Nassim Nicholas Taleb and Richard Bookstaber. What can we learn from them?

NNT has a high public profile as author of books such as Fooled by Randomness and The Black Swan. These books emphasise both the uncertainty that attends our world and our inability to adapt to this uncertainty. If we characterise unpredictability as randomness, we generally show ourselves to be poor at interpreting random events and at understanding the implications of a random future.

For example, NNT develops the concept of the lucky fool. Traders who win gain a reputation and attract custom; those who don’t get fired. There’s nothing clever about a successful trader; they just got lucky in a random and unpredictable market – see the multifractal post for more on this. The same applies to our view of sport – a randomness classic. It’s widely thought that the results of football matches are a deterministic function of the players’ and managers’ ability, with maybe a few other factors thrown in. This creates an industry of meaningless statistics about how often City have lost at United when there’s an ‘r’ in the month and so on. And a run of losses leads to the ritual sacrifice of the manager.
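
To make the lucky fool concrete, here is a minimal simulation – my own sketch, not NNT’s – in which every trader’s annual result is a fair coin flip. A sizeable crowd of five-year ‘stars’ emerges from pure chance:

```python
import random

# Sketch of the 'lucky fool': each trader's yearly performance is a
# pure coin flip; count how many compile an unbroken five-year winning
# streak by luck alone.
random.seed(42)

n_traders = 10_000
n_years = 5

streak_winners = sum(
    all(random.random() < 0.5 for _ in range(n_years))
    for _ in range(n_traders)
)

# With a fair coin we expect 10,000 * 0.5**5, roughly 312 'star' traders.
print(f"{streak_winners} of {n_traders} traders won every year by chance")
```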

NNT’s most celebrated construct is The Black Swan, a severe event which no-one foresaw but which everyone can rationalise a posteriori. The financial collapse is not a Black Swan (obviously, since it was predicted), but its occurrence can be attributed to the way we build complex systems that we don’t understand and then overrate our ability to deal with them.

How can these insights be used to improve our politics? Interestingly NNT seems to have built a constructive relationship with David Cameron, the British Prime Minister. A BBC radio programme has documented this and you can see them cuddling up to each other on YouTube. It seems Cameron is interested in how the political process can adapt to uncertainty / unpredictability / randomness. What’s less clear is what impact NNT is having, if any. The main outcome seems to be a sort of small-is-beautiful localism. This is not making much headway in the current strained political climate and in any case will not be much use with some of the more global issues we face (see future posts on long term risks and the World Economic Forum’s risk analysis).

Coming back to our overconfidence in predicting the future: Bookstaber, in his book A Demon of Our Own Design, lays out the fundamental flaws in the market modelling used in the financial industry. He chronicles the rich history of disasters our ignorance has created, from the engaging perspective of having been present at most of them. And the first, memorable words of his 2008 foreword to a book published in 2007 are, “what a mess!”

At the core of both accounts is the failure of so-called bell curves to model the distribution of events adequately, and the failure of markets to behave in accordance with their assumed perfection. Each serious episode is exacerbated by a liquidity crisis.
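
To see how badly a bell curve can understate the tails, compare upper-tail probabilities under a normal distribution and a fat-tailed alternative. The Student-t with three degrees of freedom used here is an illustrative assumption of mine, not a model either author endorses:

```python
from scipy.stats import norm, t

# Probability of a move beyond k 'sigmas' under a bell curve versus a
# fat-tailed Student-t (df=3). Each distribution is evaluated in its own
# units; the point is the orders-of-magnitude gap in the far tail.
for k in (3, 5, 10):
    p_normal = norm.sf(k)      # upper-tail probability, bell curve
    p_fat = t.sf(k, df=3)      # upper-tail probability, fat tails
    print(f"{k} sigma: normal {p_normal:.2e}, fat-tailed {p_fat:.2e}, "
          f"ratio {p_fat / p_normal:.1e}")
```

At ten sigmas the bell curve calls the event essentially impossible while the fat-tailed distribution still gives it real weight – which is roughly the gap between the models and the markets.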

The wider lesson Bookstaber draws concerns the risks associated with systems which are (a) complex and (b) tightly coupled, that is, one part affects another in an immediate way that is hard to interrupt. Such systems are accidents waiting to happen, to such an extent that Charles Perrow characterised them as subject to ‘normal accidents’: you’re going to get them if you insist on putting this kind of system together. The idea is supposed to apply to nuclear reactors as much as to the financial world – an inference I think is dubious, as nuclear reactors are nothing like as complex as banks, despite what bankers seem to think.
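
Here is a toy sketch of the coupling point – my construction, not Perrow’s or Bookstaber’s model. A shock travels down a chain of components, each passing on whatever exceeds its buffer; tight coupling, modelled as small buffers, turns a local shock into a system-wide failure:

```python
# A fixed shock hits node 0 of a chain; each node absorbs what its
# buffer allows and passes the excess straight to the next node.
def cascade(buffers, shock):
    failed = 0
    for buffer in buffers:
        if shock <= buffer:
            break              # this node absorbs the remaining shock
        shock -= buffer        # the excess propagates immediately
        failed += 1
    return failed

chain_length = 10
shock = 5.0
print("loose coupling:", cascade([2.0] * chain_length, shock), "nodes fail")
print("tight coupling:", cascade([0.1] * chain_length, shock), "nodes fail")
```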

What’s more, any attempt to regulate such a system is doomed to failure because the regulator becomes part of the system and just makes it more complex (see for example my BBC blog).

Here’s another example, which I owe to Tim Harford, an economics journalist who addressed the IRM forum in Manchester last April. Imagine a small bank in Oklahoma in 2008. The bank is very prudent and has kept well away from sub-prime based securities, so its risk is small and its credit rating decent. But it can improve its credit rating by insuring the (small) credit default risk with a triple-A insurer, AIG say. You can guess the rest. With its gleaming new credit rating it can decrease its capital reserves and expand its business. Until September 2008, that is, when the regulator gets on the phone and points out that AIG has collapsed and with it the bank’s credit rating. Now it needs to raise some capital and sell off some of its securities. Unfortunately the bank is not the only one taking this call: liquidity and prices hit rock bottom and everyone, however prudent they have been, is in trouble.
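
A back-of-envelope version of the story, with entirely hypothetical capital ratios rather than actual regulatory rules, shows how the rating uplift expands the balance sheet and how the insurer’s collapse converts that expansion into an overnight shortfall:

```python
# Hypothetical capital ratios by credit rating -- illustration only.
required_capital = {"AA": 0.04, "BBB": 0.08}

assets = 100.0
capital = assets * required_capital["BBB"]    # prudent bank, BBB on its own

# Insuring the default risk with a AAA insurer lifts the effective
# rating, so the same capital now supports a bigger balance sheet.
expanded_assets = capital / required_capital["AA"]
print(f"capital {capital:.1f} supports {expanded_assets:.0f} of assets at AA")

# The insurer collapses: the rating reverts and the bank is suddenly short.
shortfall = expanded_assets * required_capital["BBB"] - capital
print(f"overnight capital shortfall: {shortfall:.1f}")
```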

So ratcheting up the capital is not the answer (though Harford seemed to think it was, in spite of this story). Bookstaber thinks you have to de-complexify, and turns to the humble cockroach. This is a creature which has survived for millions of years by a very simple means: it runs away from the prevailing air currents. This contrasts with short-lived species which develop very specialised niches but go extinct when something dramatic happens.
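
A toy comparison – again my sketch, with made-up payoffs – makes the point: a strategy optimised for the regime it grew up in beats the coarse rule right up until the regime changes:

```python
# Coarse 'cockroach' rule versus a regime-specialised strategy.
def payoff(regime, action):
    # The specialised action pays well in regime 'A' and is ruinous in
    # regime 'B'; the coarse action pays modestly in every regime.
    if action == "specialised":
        return 2.0 if regime == "A" else -10.0
    return 0.5

specialist = cockroach = 0.0
for step in range(100):
    regime = "A" if step < 80 else "B"   # the world changes late in the run
    specialist += payoff(regime, "specialised")
    cockroach += payoff(regime, "coarse")

print(f"specialist: {specialist:+.0f}, cockroach: {cockroach:+.0f}")
```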

These analyses, bolstered by the credibility their authors have earned by being proved right, provide a rich vein of suggestions for how we can best try to cope with our uncertain future.

Firstly, we need to highlight the lessons of NNT’s black swan thinking for organisational risk management. Some people find it difficult to look beyond NNT’s unpleasing persona to see what they can learn. That learning will certainly include the role of probabilities.

More importantly, we need to think about dealing with complexity and how we can become cockroaches. Business continuity is ahead of us here, with its emphasis on the resilient organisation. But to what extent is the resilient organisation actually effective against black swans or, for that matter, anything which lies outside our experience? And to what extent are agility or adaptability better responses than cockroach simplicity or resilience? What can we learn from highly reliable systems such as aircraft carriers?

Most importantly of all, how do these concepts contribute to our risk culture: what we do when the processes – fragile as they are – have failed? This will be a key element in meeting the clouds of vagueness goal of improving organisational risk management.
