
Wanderings

Don’t worry, the end is nigh

The next article on this site will be about applying M Bricolage’s antifragility ideas to organisational risk management.  Not surprisingly, I thought it was time to take a break for a little light amusement.  I’ve just discovered that the University of Cambridge is thinking about setting up a Centre for the Study of Existential Risk.  To my slight disappointment this is not the possibility that Team Sartre, Camus et al (inc M Bricolage?) will finally succumb to the combined forces of rationalists, positivists and other followers of the Enlightenment.  (That’s not a risk, that’s an issue.  Ed.)  Instead it’s the possibility that the human race will become extinct: no humans, no existence – geddit?

In contemplating the demise of our species you will not have missed (though unforgivably, I did) that failure to last another 200 years means that we shall be prevented from celebrating the millennium of the august University.  That seems to be the main basis of an appeal for ideas and funds to set up the Centre.  I’m glad they didn’t miss the last bit of ‘the effect of uncertainty on objectives.’

The primary concern of the proposed Centre seems to be the development of AI or, as the alumni rag CAM thoughtfully puts it, Man vs Robot (or, as The Guardian even more thoughtfully puts it, Terminator studies).  It seems that we can expect to be overtaken by computers in the coming decades and “it is probably a mistake to think that any artificial intelligence, particularly one that arose accidentally, would be anything like us and would share our values, which are the product of millions of years of evolution.”  There is so much in this aperçu to think about that it could only have been made by the Bertrand Russell Professor of Philosophy.

While AI is top of the list, the Centre’s site also suggests biotechnology (and artificial life), nanotechnology and anthropogenic climate change (extreme effects only).  It’s interesting to compare that list to the global risks identified by the World Economic Forum (or, if you don’t want to go straight to the 12MB report, follow this link), headed up, for example, by climate change again, combined with systemic economic failure.  I’ll be doing this in more detail in future, mainly as I’m organising a meeting in a couple of weeks on global risks.  However, although the WEF study has some questionable areas (alien life forms anyone?), it does not share what I see as the unworldliness of the Cambridge concerns.

We shall come back to this many times in the (near) future, I hope, but meanwhile put your hands in your pockets and let’s get the world’s first professor of existential risk working at a “place where science and philosophy intersect.”  Whatever next?
