  • Unaligned AI: future artificial intelligence (AI) with goals that may differ from ours.
  • Man-made pandemics: pandemics caused by genetically engineered pathogens, leading to extinction.
  • Climate change: extreme climate change scenarios causing complete extinction.
  • Nuclear war: extreme nuclear war causing complete extinction.
  • Total natural risk: the sum of all natural extinction risks, such as supervolcanoes and asteroids.
  • Other man-made risks: other man-made extinction risks, including technologies still to be invented.


Their estimated chances of occurrence in the next hundred years are presented below.
It can be seen that:
  • These existential risks are unacceptably high.
  • Man-made extinction risks, such as unaligned AI and man-made pandemics, pose a far greater risk than natural ones. These risks are preventable in principle.

The existential risk percentages shown in this graph are far from exact, since our knowledge of existential risks is limited. Moreover, to our knowledge, only two quantified scientific estimates of existential risk¹ ² are currently available, and both come from the same institute (the Future of Humanity Institute, Oxford University).

However, because of the seriousness of human extinction, we think that even a slight scientific indication is sufficient reason to make existential risks more widely known and discussed. We also believe we should urgently conduct more existential risk research, at diverse institutes with researchers from multiple fields, so as to improve the accuracy and robustness of these findings as soon as possible.

Even the evidence currently available convinces us that existential risk has increased significantly because of human action, in ways that are potentially preventable. Therefore, we should:

  1. Find out more accurately and robustly how high the existential risks are and how to decrease them, by pursuing more existential risk research at different institutes.
  2. Globally prioritize preventing these risks from materializing.
¹ Ord, Toby. The Precipice: Existential Risk and the Future of Humanity. Hachette Books, 2020.
² Sandberg, Anders & Bostrom, Nick. Global Catastrophic Risks Survey, Technical Report #2008-1, Future of Humanity Institute, Oxford University: pp. 1-5.