How Not to Go Extinct?

Michael O'Connor discusses Disease X and other existential threats to humanity. 

Since 2018, the World Health Organization’s list of diseases that pose the greatest threat to global public health has ominously included ‘Disease X’, a placeholder term for an unknown pathogen with the potential to cause an international epidemic. Now, in 2020, Disease X has struck in the form of COVID-19. COVID-19 may well be the first Disease X, but it certainly won’t be the last. The next one might emerge on a farm in rural Nigeria or in a research lab somewhere in the United States. Only one thing is certain: it will be devastating.

Disease X belongs to a class of events that pose a global catastrophic risk: that is, they threaten to cause worldwide disruption of economic and social order as well as large-scale loss of life. At one extreme are so-called existential risks, which the philosopher Nick Bostrom defines as risks that threaten the extinction of the entire human race or the severe curtailment of its future development. The most pertinent of these risks is climate change, the Damoclean sword that hangs over our everyday existence. Others include the development of malign artificial intelligence, the emergence of a pathogen even deadlier than COVID-19 by accident or design, nuclear war, and a meteor strike.

It’s clear to any reasonable person that we should do everything we can to address climate change. It’s happening now and its impact will be enormous. But we also need to pay more attention to other existential risks. Although these are often low-probability events, their potential impact is vast. As the philosopher Derek Parfit argues in his book Reasons and Persons, the extinction of the human race is a far more terrible outcome than even the death of, say, 99% of the human population, because it would prevent the birth of billions of future people. A huge number of potential humans would never come to exist; all the works of art and knowledge that they might create would never come to be.
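Parfit’s comparison can be put in rough arithmetic terms. In the sketch below, the symbols are my own assumptions rather than Parfit’s: N stands for the current population and F for the number of future people whose existence extinction would foreclose.

```latex
% Rough formalisation of Parfit's comparison (symbols N and F assumed here).
% Loss(peace) = 0;  Loss(99% die) = 0.99N;  Loss(extinction) = N + F.
\[
  \underbrace{(N + F) - 0.99N}_{\text{extinction vs.\ 99\% death}}
  \;=\; 0.01N + F
  \;\gg\;
  \underbrace{0.99N - 0}_{\text{99\% death vs.\ peace}}
  \qquad \text{since } F \gg N .
\]
```

If humanity survives for even a fraction of a typical species’ lifespan, F plausibly runs into the trillions, dwarfing any loss confined to the present generation. That is why the step from 99% to 100% matters so much more than the step from 0% to 99%.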

To address existential risks, we need to identify potential risks; quantify their potential impact and probability of occurrence; undertake measures to prevent them; and implement strategies to mitigate their impact. Such work is already under way. Since the 1980s, a number of centres have been set up to study existential risks, such as the Centre for the Study of Existential Risk at the University of Cambridge. Preventative measures are also in the works. In 2021, NASA plans to launch a spacecraft on a collision course with Dimorphos, the small moonlet of the asteroid 65803 Didymos. This so-called Double Asteroid Redirection Test will establish whether or not we could redirect an Earth-bound asteroid and thus prevent humans from meeting the grisly end of the dinosaurs. Meanwhile, various organizations are drawing up guidelines for areas of research liable to produce catastrophic events, such as work on AI or synthetic biology.
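To make the ‘quantify’ step concrete, here is a minimal sketch in Python of the kind of expected-value calculation such centres might perform: rank risks by probability multiplied by impact. The risk names and every figure below are illustrative placeholders of my own, not real estimates.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    annual_probability: float  # chance of occurring in a given year (placeholder)
    impact: float              # deaths if it occurs (placeholder)

    @property
    def expected_annual_harm(self) -> float:
        # Expected-value ranking: probability multiplied by impact.
        return self.annual_probability * self.impact

# Placeholder figures for illustration only; not real estimates.
risks = [
    Risk("engineered pathogen", 2e-4, 1e9),
    Risk("nuclear war",         1e-3, 1e8),
    Risk("asteroid impact",     1e-6, 7e9),
]

for r in sorted(risks, key=lambda r: r.expected_annual_harm, reverse=True):
    print(f"{r.name}: {r.expected_annual_harm:,.0f} expected deaths per year")
```

Even a toy calculation like this makes the core point visible: a risk can be very improbable and still dominate the ranking once its impact is large enough.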

A number of individuals and organizations warned of the risk of a global pandemic in the years leading up to 2020. Bill Gates has played the role of Cassandra on several occasions over the past decade, and the UK government’s 2019 National Security Risk Assessment advised stockpiling face masks and other personal protective equipment to cope with a potential pandemic. If we had prepared better for the present pandemic, as a country and as a planet, the death toll might have been much lower.

We need a global effort to identify existential risks and to work out how to mitigate or prevent them. In an increasingly fragmented world, it might seem hopelessly optimistic to suppose that we can band together to address such risks. But if we do not plan for and mitigate existential risks, we will find ourselves just as unprepared when Disease X, or Event X, strikes again. Hindsight is a wonderful thing. Foresight is even better.

Michael is a graduate student in philosophy who enjoys food, quizzing and campaigning for the Labour Party. 

He tweets at @Michael85994220.
