Priority areas

Closing existential risks

The areas we have covered so far – artificial intelligence, biohazards, extreme climate change, and nuclear weapons – are given high priority because their consequences could be enormous, at worst threatening our survival. At the same time, these assessments are uncertain. There may be other, better ways to contribute to a positive future, for example if the probability of these existential catastrophes is lower than we thought. In many cases, we are also unsure how to reduce the risks effectively. These uncertainties are why the next area seems promising to work on.

Read more

- Our current view of the world's most pressing problems, article by 80,000 Hours
- The case for reducing extinction risk, article by 80,000 Hours
- The Precipice, book by Toby Ord
- Interview with Toby Ord on the 80,000 Hours podcast
- Global Catastrophic Risks 2020, a report by the Global Challenges Foundation
- Existential Risk, a report by the Future of Humanity Institute