Placard against omnicide, at Extinction Rebellion (2018).

Some philosophers, among them the antinatalist David Benatar, animal rights activist Steven Best and anarchist Todd May, posit that human extinction would be a positive thing for the other organisms on the planet, and the planet itself, citing for example the omnicidal nature of human civilization.[85][86][87]

Research

Psychologist Steven Pinker calls existential risk a "useless category" that can distract from real threats such as climate change and nuclear war. In contrast, other researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks; his comparisons have been criticized as "high-handed".[88][89] As of 2020, the Biological Weapons Convention organization had an annual budget of US$1.4 million.[90]

Although existential risks are less manageable by individuals than, e.g., health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."[91]

Multiple organizations with the goal of helping prevent human extinction exist. Examples are the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute (est. 2011).

In popular culture

Jean-Baptiste Cousin de Grainville's 1805 Le dernier homme (The Last Man), which depicts human extinction due to infertility, is considered the first modern apocalyptic novel and credited with launching the genre.[92] Other notable early works include Mary Shelley's 1826 The Last Man, depicting human extinction caused by a pandemic, and Olaf Stapledon's 1937 Star Maker, "a comparative study of omnicide".[3]

Some 21st century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[93][94] A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[95] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[96]