Existential Risks

A global catastrophic risk or a doomsday scenario is a hypothetical future event that could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an "existential risk." Over the last two decades, a number of academic and non-profit organizations have been established to research global catastrophic and existential risks, formulate potential mitigation measures and either advocate for or implement these measures.


Definition and classification


Defining global catastrophic risks

The term global catastrophic risk "lacks a sharp definition", and generally refers (loosely) to a risk that could inflict "serious damage to human well-being on a global scale". Humanity has suffered large catastrophes before. Some of these caused serious damage but were only local in scope—e.g. the Black Death may have resulted in the deaths of a third of Europe's population, 10% of the global population at the time. Some were global, but were not as severe—e.g. the 1918 influenza pandemic killed an estimated 3–6% of the world's population. Most global catastrophic risks would not be so intense as to kill the majority of life on Earth, but even if one did, the ecosystem and humanity would eventually recover (in contrast to ''existential risks''). Similarly, in ''Catastrophe: Risk and Response'', Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. Posner highlights such events as worthy of special attention on cost–benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.


Defining existential risks

Existential risks are defined as "risks that threaten the destruction of humanity's long-term potential." The instantiation of an existential risk (an ''existential catastrophe'') would either cause outright human extinction or irreversibly lock in a drastically inferior state of affairs. Existential risks are a sub-class of global catastrophic risks, where the damage is not only ''global'' but also ''terminal'' and ''permanent,'' preventing recovery and thereby affecting both current and all future generations.


Non-extinction risks

While extinction is the most obvious way in which humanity's long-term potential could be destroyed, there are others, including ''unrecoverable collapse'' and ''unrecoverable dystopia''. A disaster severe enough to cause the permanent, irreversible collapse of human civilisation would constitute an existential catastrophe, even if it fell short of extinction. Similarly, if humanity fell under a totalitarian regime with no chance of recovery, such a dystopia would also be an existential catastrophe. Bryan Caplan writes that "perhaps an eternity of totalitarianism would be worse than extinction" (Bryan Caplan (2008). "The totalitarian threat". ''Global Catastrophic Risks'', eds. Bostrom & Cirkovic (Oxford University Press): 504–519). (George Orwell's novel ''Nineteen Eighty-Four'' suggests an example.) A dystopian scenario shares the key features of extinction and unrecoverable collapse of civilisation: before the catastrophe, humanity faced a vast range of bright futures to choose from; after the catastrophe, humanity is locked forever in a terrible state.


Potential sources of risk

Potential global catastrophic risks are conventionally classified as anthropogenic or non-anthropogenic hazards. Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the predictable transformation of the Sun into a red giant star engulfing the Earth. Anthropogenic risks are those caused by humans and include those related to technology, governance, and climate change. Technological risks include the creation of artificial intelligence misaligned with human goals, biotechnology, and nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as global war and nuclear holocaust, bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure like the electrical grid, or the failure to manage a natural or engineered pandemic. Global catastrophic risks in the domain of earth system governance include global warming, environmental degradation, extinction of species, famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture.


Methodological challenges

Research into the nature and mitigation of global catastrophic risks and existential risks is subject to a unique set of challenges and, as a result, is not easily subjected to the usual standards of scientific rigour. For instance, it is neither feasible nor ethical to study these risks experimentally. Carl Sagan expressed this with regard to nuclear war: "Understanding the long-term consequences of nuclear war is not a problem amenable to experimental verification". Moreover, many catastrophic risks change rapidly as technology advances and as background conditions, such as the geopolitical situation, shift. Another challenge is the general difficulty of accurately predicting the future over long timescales, especially for anthropogenic risks, which depend on complex human political, economic and social systems. In addition to known and tangible risks, unforeseeable black swan extinction events may occur, presenting an additional methodological problem.


Lack of historical precedent

Humanity has never suffered an existential catastrophe, and if one were to occur, it would necessarily be unprecedented. Existential risks therefore pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the failure of a complete extinction event to occur in the past is not evidence against its likelihood in the future: every world that has experienced such an extinction event has no observers, so regardless of their frequency, no civilization observes existential risks in its history. These anthropic issues may partly be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology. To understand the dynamics of an unprecedented, unrecoverable global civilizational collapse (a type of existential risk), it may be instructive to study the various local civilizational collapses that have occurred throughout human history. For instance, civilizations such as the Roman Empire have ended in a loss of centralized governance and a major civilization-wide loss of infrastructure and advanced technology. However, these examples also suggest that societies are fairly resilient to catastrophe; Medieval Europe, for example, survived the Black Death without suffering anything resembling civilizational collapse despite losing 25 to 50 percent of its population.
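
This observation selection effect can be made concrete with a small simulation. The following is a minimal sketch, where the per-century extinction probabilities are illustrative assumptions rather than estimates from the literature: whatever the true hazard rate, every surviving world looks back on an identical, spotless record, so survivors cannot infer the rate from their own history alone.

```python
import random

# A minimal sketch of the observation selection effect described above.
# The per-century extinction probabilities are illustrative assumptions,
# not estimates from the literature.

def surviving_fraction(p_per_century, centuries=100, n_worlds=20_000):
    """Fraction of simulated worlds that survive every century."""
    survivors = 0
    for _ in range(n_worlds):
        if all(random.random() >= p_per_century for _ in range(centuries)):
            survivors += 1
    return survivors / n_worlds

for p in (0.001, 0.01, 0.05):
    frac = surviving_fraction(p)
    # By construction, every surviving world has observed zero extinction
    # events, so the survivors' historical records are identical across
    # very different values of p.
    print(f"p = {p:.3f}: {frac:.1%} of worlds survive with a spotless record")
```

Lunar impact craters escape this bias because crater formation, unlike extinction, does not remove the observer.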


Incentives and coordination

There are economic reasons that can explain why so little effort goes into existential risk reduction. It is a global public good, so we should expect it to be undersupplied by markets. Even if a large nation invests in risk mitigation measures, that nation will enjoy only a small fraction of the benefit of doing so. Furthermore, existential risk reduction is an ''intergenerational'' global public good: most of its benefits would be enjoyed by future generations, and though these future people would perhaps be willing to pay substantial sums for existential risk reduction, no mechanism for such a transaction exists.
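
The undersupply argument can be illustrated with a toy calculation, using made-up numbers rather than figures from the source: a nation that bears the full cost of a mitigation measure captures only its own share of the global benefit, so measures that are clearly worthwhile for the world can fail every individual nation's cost-benefit test.

```python
# A toy model of the undersupply argument above. The benefit, cost and
# benefit shares are made-up numbers for illustration, not estimates
# from the source.

def worthwhile_for_nation(total_benefit, cost, national_share):
    """A self-interested nation funds the measure only if the share of
    the global benefit it captures exceeds the full cost it must bear."""
    return national_share * total_benefit > cost

TOTAL_BENEFIT = 1_000.0  # global benefit of a mitigation measure (arbitrary units)
COST = 100.0             # cost, borne entirely by whichever nation funds it

for share in (0.04, 0.20, 0.50):
    decision = worthwhile_for_nation(TOTAL_BENEFIT, COST, share)
    print(f"nation capturing {share:.0%} of the benefit: funds it? {decision}")

# A nation capturing 4% of the benefit declines (0.04 * 1000 = 40 < 100),
# even though the measure clearly pays off for the world (1000 > 100).
# Future generations, who would capture much of the real benefit, cannot
# pay at all, which worsens the shortfall.
```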


Cognitive biases

Numerous cognitive biases can influence people's judgment of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect. Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as willing to prevent the deaths of 200,000 or 2,000 birds. Similarly, people are often more concerned about threats to individuals than to larger groups. Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:
Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".
All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the absence of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivorship bias and other anthropic effects. Sociobiologist E. O. Wilson argued: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."


Proposed mitigation


Multi-layer defense

Defense in depth is a useful framework for categorizing risk mitigation measures into three layers of defense:
# ''Prevention'': Reducing the probability of a catastrophe occurring in the first place. Example: measures to prevent outbreaks of new highly infectious diseases.
# ''Response'': Preventing the scaling of a catastrophe to the global level. Example: measures to prevent escalation of a small-scale nuclear exchange into an all-out nuclear war.
# ''Resilience'': Increasing humanity's resilience (against extinction) when faced with global catastrophes. Example: measures to increase food security during a nuclear winter.
Human extinction is most likely when all three defenses are weak, that is, "by risks we are unlikely to prevent, unlikely to successfully respond to, and unlikely to be resilient against". The unprecedented nature of existential risks poses a special challenge in designing risk mitigation measures, since humanity will not be able to learn from a track record of previous events.
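
Read quantitatively, the framework implies that a hazard causes extinction only if it defeats all three layers in turn, so (treating the layers as independent) the overall probability is roughly the product of the three per-layer failure probabilities. A minimal sketch with assumed numbers:

```python
# A rough quantitative reading of the three-layer framework above.
# The per-layer failure probabilities are illustrative assumptions,
# and the layers are treated as independent for simplicity.

def extinction_probability(p_prevention_fails, p_response_fails, p_resilience_fails):
    """Probability that a hazard defeats prevention, then response,
    then resilience, causing extinction."""
    return p_prevention_fails * p_response_fails * p_resilience_fails

# Three moderately strong layers compound multiplicatively:
print(extinction_probability(0.1, 0.1, 0.1))  # 0.001 (0.1%)

# Weak layers compound too, which is why the risks that dominate are
# those likely to slip past all three defenses:
print(extinction_probability(0.9, 0.9, 0.9))  # 0.729 (72.9%)

# Halving any one factor halves the product, so strengthening the
# weakest (or cheapest-to-improve) layer can be the best investment.
```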


Funding

Some researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on ''Star Trek'', snowboarding, or dung beetles than on existential risks. Bostrom's comparisons have been criticized as "high-handed". As of 2020, the Biological Weapons Convention organization had an annual budget of US$1.4 million.


Survival planning

Some scholars propose establishing on Earth one or more self-sufficient, remote, permanently occupied settlements specifically created to survive a global disaster. Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes. Food storage has been proposed globally, but the monetary cost would be high; furthermore, it would likely add to the current millions of deaths per year due to malnutrition by raising food prices. In 2022, a team led by David Denkenberger compared the cost-effectiveness of resilient foods with artificial general intelligence (AGI) safety and found "∼98–99% confidence" for a higher marginal impact of work on resilient foods. Some survivalists stock survival retreats with multiple-year food supplies. The Svalbard Global Seed Vault is buried inside a mountain on an island in the Arctic. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world's crops. The surrounding rock keeps the vault at about −3 °C (as of 2015), and refrigerators powered by locally sourced coal cool it further, to about −18 °C. More speculatively, if society continues to function and the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.


Global catastrophic risks and global governance

Insufficient global governance creates risks in the social and political domain, but governance mechanisms develop more slowly than technological and social change. There are concerns from governments, the private sector, and the general public about the lack of governance mechanisms to deal efficiently with risks and to negotiate and adjudicate between diverse and conflicting interests. This is further underlined by an understanding of the interconnectedness of global systemic risks. In the absence or anticipation of global governance, national governments can act individually to better understand, mitigate and prepare for global catastrophes.


Climate emergency plans

In 2018, the Club of Rome called for greater climate change action and published its Climate Emergency Plan, which proposes ten action points to limit the increase in global average temperature to 1.5 degrees Celsius. In 2019, the Club published the more comprehensive Planetary Emergency Plan. There is evidence to suggest that collectively engaging with the emotional experiences that emerge when contemplating the vulnerability of the human species in the context of climate change allows these experiences to be adaptive. When such collective engagement with and processing of emotional experiences is supportive, it can lead to growth in resilience, psychological flexibility, tolerance of emotional experiences, and community engagement.


Space colonization

Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario. Solutions of this scope may require megascale engineering. Astrophysicist Stephen Hawking advocated colonizing other planets within the Solar System once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war. Billionaire Elon Musk writes that humanity must become a multiplanetary species in order to avoid extinction. Musk is using his company SpaceX to develop technology he hopes will be used in the colonization of Mars.


Moving the Earth

In a few billion years, the Sun will expand into a red giant, swallowing the Earth. This can be avoided by moving the Earth farther out from the Sun, keeping the temperature roughly constant. That can be accomplished by tweaking the orbits of comets and asteroids so they pass close to the Earth in such a way that they add energy to the Earth's orbit. Since the Sun's expansion is slow, roughly one such encounter every 6,000 years would suffice.
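
A back-of-envelope check, with rough assumed numbers rather than figures from the source, suggests why such rare encounters could suffice: spread over billions of years, the required orbital energy works out to roughly one large asteroid's worth of kinetic energy per flyby.

```python
# Back-of-envelope check of the gravity-assist scheme above, using rough
# assumed numbers (the target radius, time span and asteroid mass are
# illustrative, not figures from the source).

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_SUN   = 1.989e30    # solar mass, kg
M_EARTH = 5.972e24    # Earth mass, kg
AU      = 1.496e11    # astronomical unit, m

def orbital_energy(a):
    """Total energy of Earth on a circular orbit of radius a."""
    return -G * M_SUN * M_EARTH / (2 * a)

# Energy needed to raise Earth's orbit from 1 AU to, say, 1.5 AU.
delta_E = orbital_energy(1.5 * AU) - orbital_energy(1.0 * AU)

# One flyby every 6,000 years over roughly 5 billion years.
n_encounters = 5e9 / 6e3
per_encounter = delta_E / n_encounters

print(f"total energy to add:   {delta_E:.2e} J")        # ~9e32 J
print(f"number of encounters:  {n_encounters:.0f}")      # ~8e5
print(f"energy per encounter:  {per_encounter:.2e} J")   # ~1e27 J

# Consistency check: the kinetic energy of a ~1e19 kg body (roughly
# 100 km across) at 15 km/s is of the same order of magnitude, so each
# flyby only needs to redirect one such object past the Earth.
print(f"KE of 1e19 kg at 15 km/s: {0.5 * 1e19 * 15e3**2:.2e} J")
```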


Skeptics and opponents

Psychologist Steven Pinker has called existential risk a "useless category" that can distract from real threats such as climate change and nuclear war.


Organizations

The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock, established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler, who postulated "grey goo".

Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia. Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute (est. 2000), which aims to reduce the risk of a catastrophe caused by artificial intelligence, with donors including Peter Thiel and Jed McCaleb. The Nuclear Threat Initiative (est. 2001) seeks to reduce global threats from nuclear, biological and chemical weapons, and to contain damage after an event; it maintains a nuclear material security index. The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe; most of its research money funds projects at universities. The Global Catastrophic Risk Institute (est. 2011) is a US-based non-profit, non-partisan think tank founded by Seth Baum and Tony Barrett that does research and policy work across various risks, including artificial intelligence, nuclear war, climate change, and asteroid impacts. The Global Challenges Foundation (est. 2012), based in Stockholm and founded by László Szombatfalvy, releases a yearly report on the state of global risks. The Future of Life Institute (est. 2014) works to reduce extreme, large-scale risks from transformative technologies and to steer the development and use of these technologies to benefit all life, through grantmaking, policy advocacy in the United States, European Union and United Nations, and educational outreach; Elon Musk, Vitalik Buterin and Jaan Tallinn are some of its biggest donors. The Center on Long-Term Risk (est. 2016), formerly known as the Foundational Research Institute, is a British organization focused on reducing risks of astronomical suffering (s-risks) from emerging technologies.

University-based organizations include the Future of Humanity Institute (est. 2005), which researches the questions of humanity's long-term future, particularly existential risk; it was founded by Nick Bostrom and is based at Oxford University. The Centre for the Study of Existential Risk (est. 2012) is a Cambridge University-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare. All are man-made risks, as Huw Price explained to the AFP news agency: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us." Stephen Hawking was an acting adviser.
The Millennium Alliance for Humanity and the Biosphere is a Stanford University-based organization that focuses on many issues related to global catastrophe by bringing together members of academia in the humanities. It was founded by Paul R. Ehrlich, among others. Stanford University also has the Center for International Security and Cooperation, which focuses on political cooperation to reduce global catastrophic risk. The Center for Security and Emerging Technology was established in January 2019 at Georgetown's Walsh School of Foreign Service to focus on policy research into emerging technologies, with an initial emphasis on artificial intelligence; it received a grant of US$55 million from Good Ventures as suggested by Open Philanthropy.

Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises and helps member states with training and coordination of responses to epidemics. The United States Agency for International Development (USAID) has an Emerging Pandemic Threats Program which aims to prevent and contain naturally generated pandemics at their source. The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate which researches issues such as biosecurity and counter-terrorism on behalf of the government.


See also

* Apocalyptic and post-apocalyptic fiction
* Artificial intelligence arms race
* Cataclysmic pole shift hypothesis
* Community resilience
* Doomsday cult
* Eschatology
* Extreme risk
* Failed state
* Fermi paradox
* Foresight (psychology)
* Future of Earth
* Future of the Solar System
* Geoengineering
* Global Risks Report
* Great Filter
* Holocene extinction
* Impact event
* List of global issues
* Nuclear proliferation
* Outside Context Problem
* Planetary boundaries
* Rare events
* ''The Sixth Extinction: An Unnatural History'' (nonfiction book)
* Social degeneration
* Societal collapse
* Speculative evolution: studying hypothetical animals that could one day inhabit Earth after an existential catastrophe
* Survivalism
* Tail risk
* ''The Precipice: Existential Risk and the Future of Humanity''
* Timeline of the far future
* Ultimate fate of the universe
* World Scientists' Warning to Humanity


References


Further reading

* Corey S. Powell (2000). "Twenty ways the world could end suddenly". ''Discover Magazine''.
* Derrick Jensen (2006). ''Endgame''.
* Donella Meadows (1972). ''The Limits to Growth''.
* Edward O. Wilson (2003). ''The Future of Life''.
* Jim Holt, "The Power of Catastrophic Thinking" (review of Toby Ord, ''The Precipice: Existential Risk and the Future of Humanity'', Hachette, 2020, 468 pp.), ''The New York Review of Books'', vol. LXVIII, no. 3 (February 25, 2021), pp. 26–29. Holt writes (p. 28): "Whether you are searching for a cure for cancer, or pursuing a scholarly or artistic career, or engaged in establishing more just institutions, a threat to the future of humanity is also a threat to the significance of what you do."
* Michael H. Huesemann and Joyce A. Huesemann (2011). ''Technofix: Why Technology Won't Save Us or the Environment'', Chapter 6, "Sustainability or Collapse". New Society Publishers, Gabriola Island, British Columbia, Canada, 464 pages.
* Jared Diamond (2005, 2011). ''Collapse: How Societies Choose to Fail or Succeed''. Penguin Books.
* Jean-Francois Rischard (2003). ''High Noon: 20 Global Problems, 20 Years to Solve Them''.
* Joel Garreau (2005). ''Radical Evolution''.
* John A. Leslie (1996). ''The End of the World''.
* Joseph Tainter (1990). ''The Collapse of Complex Societies''. Cambridge University Press, Cambridge, UK.
* Martin Rees (2004). ''Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century—On Earth and Beyond''.
* Roger-Maurice Bonnet and Lodewijk Woltjer (2008). ''Surviving 1,000 Centuries: Can We Do It?''. Springer-Praxis Books.
* Toby Ord (2020). ''The Precipice: Existential Risk and the Future of Humanity''. Bloomsbury Publishing.


External links

* Ten scientists name the biggest dangers to Earth and assess the chances they will happen, from ''The Guardian'', April 14, 2005.
* "Humanity under threat from perfect storm of crises – study", ''The Guardian'', February 6, 2020.
* Annual Reports on Global Risk, by the Global Challenges Foundation.
* Center on Long-Term Risk
* Global Catastrophic Risk Policy