Worse than COVID-19: More can and must be done to prevent the greatest threats to human survival

Matt Boyd, Nick Wilson

Growing mushrooms: just one way to protect humanity in a period of reduced sunlight

“Governments routinely ignore seemingly far-out risks. Rocked by a global pandemic, they need to up their game” (The Economist, 27 June 2020).

It is not clear whether risks that threaten human extinction have received appropriate attention at the level of international governance. We systematically searched the documents of the UN Digital Library and concluded that they have not. Our results and commentary were recently published in the international journal Risk Analysis. In this blog we give an overview of existential risks, our findings and possible international and national solutions.

Existential risks

The COVID-19 pandemic is clearly a very serious global disaster, but there are threats far more dire. These include threats with a long history of international attention, such as a US-Russia nuclear war, as well as newer or less familiar technological risks such as geoengineering and synthetic biology. In the extreme, some large-scale global catastrophes could threaten human extinction. The following is a list of some plausible threats to human survival:

  1. Nuclear winter (the sun is obscured by soot from burning cities following a nuclear war)
  2. Artificial intelligence (AI; machines are developed in the future with goals that are not aligned to those of humanity and wreak havoc)
  3. Synthetic biology (engineering principles are used to produce dangerous biotechnology, including devastating bioweapons)
  4. Geoengineering (modification of the atmosphere or oceans to mitigate climate change goes wrong)
  5. Nanotechnology (nano-scale engineering creates a runaway process that degrades the environment)
  6. Asteroid/comet impacts (one or more large objects collide with the Earth, causing mass extinction as with the dinosaurs)
  7. Supervolcanic eruption (massive volcanic eruption causes a decades long drop in Earth’s temperature)
  8. Experimental physics disaster (high-energy physics experiment creates a devastating physical process such as a black hole)

This list of existential risks is not exhaustive; others include catastrophe due to biodiversity loss, ecosystem collapse, societal collapse, a solar storm, a flood basalt event, a close supernova/gamma-ray burst/magnetar explosion, or even attracting the attention of harmful extra-terrestrial intelligence. There are also as-yet unknown risks.

Many of these threats are not stand-alone threats but could combine with other risks. We can imagine scenarios where AI is deployed to aid the development of dangerous biotechnology, or where a pandemic emerges in a period of low health security following a nuclear war or comet impact.

Active mitigation of extinction threats is justified by the perspective of long-termism, which is grounded in the vast expected value of future human lives and the common desire to preserve aspects of the “human project,” such as our intergenerational cultural, scientific, and technological endeavours.

Results from our analysis of the UN Digital Library

We examined the UN Digital Library for evidence of any general international discussion about risks that threaten human extinction and also for evidence of discussion of the eight specific existential threats listed above.

Our search for 22 synonyms of existential risk in the UN Digital Library returned 97 relevant mentions. Over two-thirds (69%) of these pertained to nuclear war. Climate change was mentioned 24 times; however, in these cases the context was often an existential threat to island states rather than to humanity as a whole. There were a handful of references to existential threats in the context of general disarmament or weapons of mass destruction.
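As an aside, the reported shares can be reproduced with simple arithmetic. In the minimal sketch below, the nuclear-war count of 67 is implied by the reported 69% share of 97 mentions, and the small "other" remainder is our own inference from the text; the breakdown is illustrative only.

```python
# Illustrative arithmetic only: reproduce the reported shares of the 97
# relevant existential-risk mentions found in the UN Digital Library search.
# The nuclear-war count (67) is implied by the reported 69% share; the small
# "other" remainder is inferred from the text's "handful of references".

mentions = {
    "nuclear war": 67,
    "climate change": 24,   # often framed as a threat to island states
    "other (disarmament/WMD context)": 6,
}

total = 97
for threat, count in mentions.items():
    print(f"{threat}: {count} mentions ({count / total:.0%})")
```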

Strikingly, searches of the UN Digital Library revealed few if any other categories of existential risk raised in a manner that made the threat of human extinction salient.

UN documents that explicitly discuss human extinction have a limited focus

On the basis of the keyword search it appears that the UN has a long history of addressing the threat of nuclear war and has engaged with the threat from comets and near-Earth objects through the Committee on the Peaceful Uses of Outer Space. These results seem to indicate a lack of attention paid to most existential risks.

Why is there little attention to existential risks?

There are clearly competing demands on national and international policymakers. Immediate threats such as regional conflict, trade, poverty, local health and education issues, as well as environmental concerns, weigh heavily and cannot be ignored. Yet the COVID-19 pandemic has demonstrated the economic and human devastation that arises when low-probability or infrequent but catastrophic hazards are ignored.

Mitigation of existential risk is a global public good. We’ve seen with climate change how large-scale cooperation is needed to counteract the tendency for markets to undersupply such goods.

International policymakers are also not accustomed to reasoning about human extinction. Theory and frameworks may be necessary to facilitate the right discussions. However, classification frameworks for severe global catastrophic risk scenarios now exist and can aid in exploring the interplay between many interacting critical systems.

We humans are also subject to psychological biases that may prevent action. Time discounting means that we tend to prefer value now over value in the future. This is unfortunate given the intergenerational nature of the benefits of existential risk mitigation. Future people perhaps stand to benefit most, yet they lack a voice in present policy decisions. This needs to change, and the rights of future generations could be enshrined in the Universal Declaration of Human Rights.

The very fact that humanity has not yet gone extinct might also lead to neglect of extinction threats. However, this would be a mistake. We may just have been lucky to date, and changes to our situation including new technological developments can shift the odds.

Some existential risks are new (such as AI and synthetic biology) and it may take time for them to filter through to political discussions. However, given the potentially long time-lag from substantial and wide-ranging discussion to effective mitigation, this does not mean that attention can be deferred.

International solutions

Early research on existential risks focused on the kinds of threats listed above as isolated exogenous events. However, these hazards cause harm because human societies are vulnerable to harm. Large-scale risks are also inextricably linked to governance failures; they are not merely challenges for governments to overcome. It is not clear that we are developing, deploying, or governing our technology with enough wisdom. This means that, as well as implementing safeguards, we should expect safety systems to fail and have a backup plan to mitigate the impact and survive these catastrophes if prevention fails (see below).

We are right to continue to be very concerned about nuclear war and major asteroid/comet impacts and should try much harder to prevent them. However, major Earth impacts (although they could strike at any time) are extremely low-probability events. Over perhaps a five- to ten-year horizon, therefore, we ought to be more concerned with developments in synthetic biology and AI. The power of these technologies is advancing rapidly, and we may need important norms and international regulations to prevent dangerous use by states, institutions or individuals.

There are four obvious things that member nations could lobby the UN to do:

  1. Ensure that relevant bodies exist at the UN, similar to the UN Office for Disarmament Affairs (nuclear weapons) or the Committee on the Peaceful Uses of Outer Space (asteroid/comet impacts), to study and effect mitigation, and to coordinate the response to each specific risk.
  2. Ensure there is an overarching body on existential risk across these committees that addresses existential risk as a category and focuses on vulnerabilities and resilience, rather than on any single risk. This is important because the probability, magnitude and tractability of each threat vary, and resource allocation must be prioritised. By taking an approach across a portfolio of risks, and working with quantitative risk assessments that account for hazards and vulnerabilities, this body would be able to recommend which risks justify greater or lesser immediate resources.
  3. Enshrine the rights of future generations: the UN Human Rights Council might consider options for approaching the rights of future generations. Any such rights, should they be deemed relevant and perhaps enshrined in the Universal Declaration of Human Rights, could have a significant effect in guiding mitigation action across UN member nations.
  4. Develop a convention against omnicide, and any technology that could facilitate omnicide (such as possession of more than 100 nuclear weapons, possession of particular types of bioweapons, development of environmentally devouring nanomaterials, human germline manipulations causing sterilization, and so on).

National solutions

Given the immense consequences of existential threats, international governance should clearly expend some resources to study how to prevent and mitigate these threats. However, organisations such as the UN arguably have a chequered record of responding to crises. Also, some existential threats may not require a global response. Therefore, national governments, communities and individuals should all do what they can to help mitigate the threat (eg, the US Government unilaterally invests in asteroid detection).

At the level of national government, dedicated departments can study and monitor a portfolio of catastrophic risks, allocating resources to those threats with the largest expected impact (on the basis of probability, magnitude, tractability and neglectedness). In February 2020 we published a paper on AI that discusses this approach in the context of New Zealand.
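To make this concrete, here is a minimal sketch of what such portfolio scoring could look like. The scoring rule (probability x magnitude x tractability x neglectedness) and every number below are illustrative assumptions made for this blog, not estimates from our Risk Analysis or Policy Quarterly papers.

```python
# A minimal sketch of portfolio-style prioritisation across catastrophic risks.
# All inputs are illustrative assumptions, not estimates from the papers
# referenced in this post.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    annual_probability: float  # rough chance of the catastrophe in a given year
    magnitude: float           # relative severity if it occurs (arbitrary units)
    tractability: float        # 0-1: how much extra effort could reduce the risk
    neglectedness: float       # 0-1: how under-resourced mitigation currently is

    def priority_score(self) -> float:
        """Expected-impact style score used to rank where marginal resources go."""
        return (self.annual_probability * self.magnitude
                * self.tractability * self.neglectedness)

portfolio = [
    Risk("nuclear winter", 0.01, 1000, 0.3, 0.5),
    Risk("engineered pandemic", 0.02, 800, 0.4, 0.7),
    Risk("asteroid/comet impact", 0.0001, 1000, 0.6, 0.3),
]

for risk in sorted(portfolio, key=lambda r: r.priority_score(), reverse=True):
    print(f"{risk.name}: priority score {risk.priority_score():.2f}")
```

With these made-up inputs the low-probability asteroid scenario ranks well below the rapidly advancing technological risks, which mirrors the reasoning in the international solutions section above.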

Prevention, resilience and recovery

Government action should address prevention, resilience and recovery. Prevention might include multilateral disarmament negotiations, revisions to the International Health Regulations to ensure the world is prepared for a catastrophic pandemic, and regulation and oversight to ensure the safety of technologies such as AI, nanotechnology, geoengineering, and synthetic biology. Prevention might also require very rapid action at the time of catastrophe. With COVID-19, New Zealand had the luxury of learning from other nations and imposed border controls just in time. A future catastrophe could strike a country like New Zealand first, and so rehearsal, simulation, and walk-throughs of key actions are needed ahead of time.

Resilience could include economic preparedness. In the case of New Zealand, the Earthquake Commission model could be enhanced and extended to all catastrophic threats. An investment of 0.5% of GDP per annum could provide in the order of NZ$100 billion per generation to deal with unprecedented catastrophe and could have been accessed for the COVID-19 recovery. Pandemic reinsurance products briefly existed: these married superannuation funds (which save on pay-outs when there is a lot of death) with businesses (which suffer losses during pandemics). However, these no-brainer products were not popular with businesses, which clearly felt pandemic insurance was not needed, although such thinking may now be changing. Many other creative ways to fund catastrophe response may exist.
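As a rough check on the NZ$100 billion figure: assuming a GDP of around NZ$300 billion, a 30-year "generation" and a 5% annual return on the invested contributions (all our own illustrative assumptions, not figures from the post), the arithmetic works out as shown below.

```python
# Back-of-envelope check of "0.5% of GDP per annum -> roughly NZ$100 billion
# per generation". The GDP level, 30-year generation and 5% return are our
# own illustrative assumptions.

gdp = 300e9                           # NZ GDP, order of magnitude, in NZ$
annual_contribution = 0.005 * gdp     # 0.5% of GDP each year (~NZ$1.5 billion)
years = 30                            # one "generation"
annual_return = 0.05                  # assumed return on the invested fund

fund = 0.0
for _ in range(years):
    fund = fund * (1 + annual_return) + annual_contribution

print(f"Fund after {years} years: NZ${fund / 1e9:.0f} billion")  # ~NZ$100 billion
```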

Resilience can also be built at the level of individuals, communities, and local governments. Researching and implementing strategies to help individuals and towns ensure food production in a world with a period of reduced sunlight would provide resilience against nuclear winter, supervolcanic eruption and asteroid/comet impact.

Recovery might hinge on some region or population avoiding a global catastrophe and being well positioned to re-seed the Earth with people, technology and know-how. Partitioning the population to escape a catastrophic pandemic, or facilitating survival on the islands geographically most likely to endure a period of reduced sunlight, could help. International law might also need attention. For example, the International Health Regulations actively deter restrictions on travel and trade aimed at combating pandemic disease. This may need to change to empower island nations to close their borders and provide a reservoir of human capital and technological know-how to rebuild civilisation after a catastrophe.

Summary

Existential risks appear neglected by international governance. COVID-19 shows that we must invest time and resources to understand large-scale risks. We must also begin preparations to mitigate the most general effects of these threats. This includes implementing appropriate oversight and safety engineering of potentially dangerous technology, building resilience to survive a world with a period of reduced sunlight, and planning to partition humanity so that risks cannot spread to every last grouping of humans.

We risk being limited by our naïvety about many complex processes and may require new methodology and cross-disciplinary work to evaluate these threats. Governments would do well to begin by bringing the full range of domain experts to the table.

Further Reading

  1. Boyd M, Wilson N. Existential Risks to Humanity Should Concern International Policymakers and More Could Be Done in Considering Them at the International Governance Level. Risk Analysis. 2020; online first, doi: 10.1111/risa.13566.
  2. Boyd M, Wilson N. Existential Risks: New Zealand needs a method to agree on a value framework and how to quantify future lives at risk. Policy Quarterly. 2018;14(3):58–65.
  3. Ord T. The Precipice: Existential Risk and the Future of Humanity. Bloomsbury; 2020.
  4. Cotton-Barratt O, Daniel M, Sandberg A. Defence in Depth Against Human Extinction: Prevention, Response, Resilience, and Why They All Matter. Global Policy. 2020; doi: 10.1111/1758-5899.12786.

Nuclear insanity has never been worse


Donald Trump has just announced a likely build-up of US nuclear capability

The threat of nuclear war has probably never been higher, and continues to grow. Given emotional human nature, cognitive irrationality and distributed authority to strike, we have merely been lucky to avoid nuclear war to date.

These new moves undoubtedly raise the threat of a human extinction event in the near future. The reasons why are explained in a compelling podcast by Daniel Ellsberg.

Ellsberg (the leaker of the Pentagon Papers, whose fallout helped end the Nixon presidency) explains the key facts. Contemporary modelling shows the likelihood of a nuclear winter is high if more than a couple of hundred weapons are detonated. Previous Cold War modelling ignored the smoke from the fires in burning cities, and so vastly underestimated the risk.

On the other hand, detonation of a hundred or so warheads poses little or no risk of nuclear winter (merely catastrophic destruction). As such, and as nuclear strategist Ellsberg forcefully argues, the only strategically relevant nuclear weapons are those on submarines. This is because they cannot be targeted by pre-emptive strikes, and yet (with around 300 warheads) still provide the necessary deterrence.

Therefore, land-based ICBMs are of no strategic value whatsoever, and merely provide additional targets for additional weapons, thereby pushing the nuclear threat from the deterrence/massive destruction game into the human extinction game. This is totally unacceptable.

Importantly, Ellsberg further argues that the reason the US is so determined to continue to maintain and build nuclear weapons is the billions of dollars this generates in business for Lockheed Martin, Boeing, and others. We are escalating the risk of human extinction in exchange for economic growth.

John Bolton, Trump’s National Security Advisor, is corrupted by the nuclear lobbyists and stands to gain should capabilities be expanded.

There is no military justification for more than a hundred or so nuclear weapons (China's nuclear policy reflects this: it is capable of building many thousands but maintains only a fraction of that number). An arsenal of a hundred warheads cannot destroy life on planet Earth, and if those warheads are on submarines they are difficult to target. Yet perversely we sustain thousands of weapons, at great risk to our own future.

The lobbying for large nuclear arsenals must stop. The political rhetoric that this is for our own safety and defence must stop. The drive for profit above all else must stop. Our children’s future depends on it.
