Existential risks

With the advent of nuclear weapons, a few world leaders gained, for the first time, the capacity to kill hundreds of millions of people at the push of a button. A subsequent nuclear winter would make farming difficult, and the death toll would likely rise even further. Since then, escalating climate change, engineered viruses, and artificial intelligence have been added to the list of catastrophic risks we face.

Few people who want to improve the world focus on risks that threaten the very existence of humanity; many associate such thoughts with science fiction. But the fact is that we are exposed to existential risks to an extent too great to ignore. There are several historical examples of actors coming close to launching nuclear weapons in ways that could have triggered a nuclear war. There have also been several incidents in which infectious agents leaked from research labs. If that were to happen with a truly dangerous pathogen, we could face a pandemic far worse than COVID-19.

In disaster scenarios, such risks could cause billions of deaths, and some of them may even threaten to permanently end human civilization. Beyond the obvious tragedy this would be for those of us alive now, it would also mean the loss of all future generations. This may sound like doomsday talk and be hard to imagine, but that probably says more about our psychology than about reality. In his book The Precipice, Oxford researcher Toby Ord presents his best estimate of the risk that we suffer an existential catastrophe in the next hundred years: a full one in six, like rolling a die. Even events with a lower probability than that are worth worrying about, because the consequences, should they occur, would be so devastating.

Furthermore, some of these risks are widely overlooked. Each year, less than $50 million goes to work on AI safety and global biorisks. Compare that to the billions of dollars allocated to better-known causes such as education, counterterrorism, and poverty alleviation in rich countries. In other words, AI safety and biorisk could be around 1,000 times more neglected. A relatively small number of additional people could therefore significantly reduce these risks.

Research has made us increasingly convinced that one of the most important challenges facing our generation is to reduce the risk of global catastrophe and increase the chance of a positive future. That so much is at stake while these issues remain relatively overlooked means they are generally considered to offer some of the best opportunities to make a big difference today.

We will now briefly go through these pressing challenges one by one. At the end of each section, there are links for those who want to dig deeper and read more about concrete job opportunities.

Read more

Our current view of the world's most pressing problems, article by 80,000 Hours
The case for reducing extinction risk, article by 80,000 Hours
The Precipice, book by Toby Ord
Interview with Toby Ord on the 80,000 Hours podcast
Nuclear security, article by 80,000 Hours
Global Catastrophic Risks 2020, report by the Global Challenges Foundation
Existential Risk, report by the Future of Humanity Institute
Here Be Dragons, book by Olle Häggström
On the Future, book by Lord Martin Rees