Optimism about the Future of Humanity: a conference on existential risks

In 1826 Mary Shelley crafted a vision of humanity’s end in ‘The Last Man’, depicting a world that persists, indifferent to the demise of our species. The end came at the hands of a pandemic, spread by the human technologies of trade and news.

Since the development of nuclear weapons in 1945, humanity has wielded technological power of extreme destruction, and the expert consensus is that the greatest threat to humans is humans themselves.

But given that we are the threat, there is also cause for much hope. Humans are self-reflexive and can change behaviour. Technology has raised the standard of living and human wellbeing worldwide, has provided the tools to escape the Covid-19 pandemic, and promises the foundation for a flourishing future.

Provided we govern and wield technology with appropriate wisdom.

The Existential Risk Observatory, founded in 2021 in the Netherlands, is the latest in a series of global institutions concerned with humanity’s future, with a mission to ensure a thriving global society immune to existential threats.

Driven by optimism for our collective future, the Observatory convened a conference on existential risks and invited speakers from around the world.

I had the privilege of presenting my take on biological threats, drawing on research I’d undertaken in conjunction with Nick Wilson of the University of Otago, and others, prior to Covid-19, as well as lessons from New Zealand’s experience with Covid-19, and international research on biological threats.

You can watch my presentation by clicking this link (Session two, talk from 25:10, Q&A from 1:08:55).

Below, I’ve provided the full menu of talks at the conference, which includes:

  • artificial intelligence
  • climate change
  • nuclear weapons
  • biological threats
  • policy approaches

Existential Risk Observatory (Netherlands) Conference on Existential Risks

Session one (7 October 2021)

0:00                 Introduction to the Conference

17:45               Power Hour (general discussions of the conference’s themes)

1:20:45            Climate Change – Ingmar Rentzhog (Founder/CEO We Don’t Have Time)

2:47:34            Existential Risks – Simon Friederich (University of Groningen)

3:48:00            Artificial Intelligence – Roman Yampolskiy (University of Louisville)

Session two (8 October 2021)

0:00                 Introduction to Session Two

25:10               Biological Risks – Matt Boyd (Adapt Research Ltd, New Zealand)

1:26:15            Policy – Rumtin Sepasspour (Cambridge Centre for the Study of Existential Risk)

2:52:25            Nuclear Weapons – Susi Snyder (PAX, Nobel Peace Laureate)

3:56:29            Artificial Intelligence Policy – Claire Boine (Harvard & Future of Life Institute)

As Rumtin Sepasspour (Research Affiliate, Cambridge University) noted in his presentation, governments are key stakeholders in the quest for immunity from existential risks, particularly those that arise from the accidental or deliberate use of technology. Governments should look at existential risks as a set to be analysed, prioritised and mitigated.

In our quest to understand, prevent, prepare for and respond to existential threats, every country should hold meetings like these, bringing diverse stakeholders together to share knowledge and ideas for successfully navigating the period where our technological power outstrips our institutional wisdom.

A new report from the Secretary-General of the United Nations, ‘Our Common Agenda’, calls on nations to develop foresight and futures capabilities under an umbrella of coordinated global action.

A very good, informed summary and discussion of the UN report can be read here.

Nuclear insanity has never been worse


Donald Trump has just announced a likely build-up of US nuclear capability.

The threat of nuclear war has probably never been higher, and continues to grow. Given emotional human nature, cognitive irrationality and distributed authority to strike, we have merely been lucky to avoid nuclear war to date.

These new moves undoubtedly raise the threat of a human extinction event in the near future. The reasons why are explained in a compelling podcast by Daniel Ellsberg.

Ellsberg (famous for leaking the Pentagon Papers during the Nixon presidency) explains the key facts. Contemporary modelling shows the likelihood of a nuclear winter is high if more than a couple of hundred weapons are detonated. Previous Cold War modelling ignored the smoke from the resulting firestorms, and so vastly underestimated the risk.

On the other hand, detonation of a hundred or so warheads poses low or no risk of nuclear winter (merely catastrophic destruction). As such, and as nuclear strategist Ellsberg forcefully argues, the only strategically relevant nuclear weapons are those on submarines. This is because they cannot be targeted by pre-emptive strikes, and yet still (with n = 300 or so) provide the necessary deterrence.

Therefore, land-based ICBMs are of no strategic value whatsoever, and merely provide additional targets for additional weapons, thereby pushing the nuclear threat from the deterrence/massive destruction game into the human extinction game. This is totally unacceptable.

Importantly, Ellsberg further argues that the reason the US is so determined to maintain and build nuclear weapons is the billions of dollars this generates in business for Lockheed Martin, Boeing, and others. We are escalating the risk of human extinction in exchange for economic growth.

John Bolton, Trump’s National Security Advisor, is corrupted by the nuclear lobbyists and stands to gain should capabilities be expanded.

There is no military justification for more than a hundred or so nuclear weapons (China’s nuclear policy reflects this: it is capable of building many thousands, but maintains only a fraction of that number). An arsenal of a hundred warheads cannot destroy life on planet Earth, and if those warheads are on submarines they are difficult to target. Yet perversely we sustain thousands of weapons, at great risk to our own future.

The lobbying for large nuclear arsenals must stop. The political rhetoric that this is for our own safety and defence must stop. The drive for profit above all else must stop. Our children’s future depends on it.