David talks to Martin Rees about how we should evaluate the greatest threats facing the human species in the twenty-first century. Does the biggest danger come from bio-terror or bio-error, climate change, nuclear war or AI? And what prospects does space travel provide for a post-human future?
Existential risk is risk that cascades globally and inflicts a severe setback on civilization. We are now so interconnected and so empowered as a species that humans themselves could be responsible for this kind of destruction.
There are four categories of existential risk: climate change, bio-terror/bio-error, nuclear weapons, and AI/new technology.
These threats are human-made. Solving them is also our responsibility.
And as ever, recommended reading curated by our friends at the LRB can be found here: lrb.co.uk/talking