Human extinction

From Wikipedia, the free encyclopedia

Human extinction is the hypothesized end of the human species. Various scenarios have been discussed in science, popular culture and religion (see End time). The scope of this article is existential risks. Humans are very widespread on the Earth, and live in communities that (while interconnected) are capable of some level of basic survival in isolation. Therefore, pandemics and deliberate killing aside, to achieve human extinction the entire planet would have to be rendered uninhabitable, with no opportunity for humans to establish a foothold beyond Earth. This would typically occur during a mass extinction event, for which precedents exist, such as the Permian–Triassic extinction event.

Near-future anthropogenic extinction scenarios include[1] global nuclear annihilation, total global war, dysgenics, overpopulation,[2] and a global accidental pandemic; natural scenarios include meteor impact and large-scale volcanism; hybrid anthropogenic-natural scenarios include global warming leading to catastrophic climate change. Naturally caused extinction events have occurred multiple times in the geologic past, although the probability of recurrence within the human timescale of the near future is negligibly small. As technology develops, there is a theoretical possibility that humans may be deliberately destroyed by the actions of a nation state, corporation or individual in a form of global suicide attack. There is also a theoretical possibility that technological advancement may resolve or prevent potential extinction scenarios. The emergence of a pandemic of such virulence and infectiousness that very few humans survive the disease is a credible scenario. While not necessarily a human extinction event, this may leave only very small, very scattered human populations that would then evolve in isolation. It is important to differentiate between human extinction and the extinction of all life on Earth. Of possible extinction events, only a pandemic is selective enough to eliminate humanity while leaving the rest of complex life on Earth relatively unscathed.

Possible scenarios

Severe forms of known or recorded disasters

U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity" if left unchecked is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction.[5]

Long-term habitat threats

  • About 1 billion years from now, the Earth's oceans will disappear as the Sun brightens. Well before this, however, the level of carbon dioxide in the atmosphere will become too low to support plant life, destroying the foundation of the food chains.[6] See Future of the Earth.
  • About 5–6 billion years from now, the Sun will begin to become a red giant. The oceans and much of the atmosphere will boil away, and the Earth's temperature will rise well above the boiling point of water. About 7–8 billion years from now, the Earth will probably be engulfed by the expanding Sun and destroyed.[7][8]

Population decline

Scientific accidents

  • The creators of the first "superintelligent" entity could make a mistake and inadvertently give it goals that lead it to immediately "annihilate" the human race.[3][10]
  • In his book Our Final Hour, Sir Martin Rees claims that without the appropriate regulation, scientific advancement increases the risk of human extinction as a result of the effects or use of new technology. Some examples are provided below.
    • Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).
    • Creation of a "micro black hole" on Earth during the course of a scientific experiment, or other foreseeable scientific accidents in high-energy physics research, such as vacuum phase transition or strangelet incidents. There were concerns about the Large Hadron Collider at CERN, as some feared that collisions of protons at nearly the speed of light would create a black hole; however, it has been pointed out that far more energetic collisions occur routinely in Earth's atmosphere.

Scenarios of extraterrestrial origin

  • Major impact events.
  • Gamma-ray burst in our part of the Milky Way. (Bursts observable in other galaxies are calculated to act as a "sterilizer", and have been used by some astronomers to explain the Fermi paradox.) The lack of interruptions in the fossil record, and the relative distance of the nearest hypernova candidate, make this a long-term rather than imminent threat.
    • Wolf-Rayet star WR 104, which is 8000 light years from the Sun, may produce a gamma ray burst aimed at the Sun when it goes supernova.
  • Invasion by militarily superior extraterrestrials (see alien invasion). Although often considered a scenario purely from the realm of science fiction, professional SETI researchers have given this possibility serious consideration, but conclude that it is unlikely.[2]
    • Gerard O'Neill has cautioned that first contact with alien intelligence may follow the precedent set by historical examples of contact between human civilizations, where the less technologically-advanced civilization has inevitably succumbed to the other civilization, regardless of its intentions.
  • A vacuum phase transition could destroy the universe.

Other

  • Modification of humans into a new species
    • Technological transition into a posthuman life-form or existence.
    • Biological evolution of humanity into another hominid species. Humans will continue to evolve via traditional natural selection over a period of millions of years, and Homo sapiens may gradually transition into one or more new species.

Perception of human extinction risk

The threat of nuclear annihilation was a significant concern in the lives of many people from the 1950s through the 1980s.

All past predictions of human extinction have proven to be false; to some, this makes future warnings seem less credible. John von Neumann was probably wrong in having "a certainty"[3] that nuclear war would occur. (Of course, our survival is not, in itself, proof that the chance of a fatal nuclear exchange was low, or that such an event could not occur in the future.) Others, such as Nick Bostrom, argue that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects. Bostrom speculates that extinction risk-analysis may be an "overlooked field" both because the subject is too psychologically troubling to attract researchers, and because the lack of previous human extinction events leads to a depressed view of the likelihood of extinction under changing future circumstances.

It is possible to do something about dietary or motor-vehicle health threats. It is much harder to know how existential threats should be minimized.[4]

Some behavioural finance scholars claim that recent evidence is given undue significance in risk analysis. Roughly speaking, "100 year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented.[5]

Many people believe humanity's intelligence and sense of self-preservation offer safeguards against extinction. They argue that people will find creative ways to overcome potential threats, and will apply the precautionary principle when attempting dangerous innovations. Others believe that the management of destructive technology is becoming increasingly difficult, and that the precautionary principle is often abandoned whenever the reward appears to outweigh the risk.

Shortly before the Trinity nuclear test, one of the project's lead scientists, Edward Teller, speculated that the fission explosion might destroy New Mexico and possibly the world, by causing a reaction in the nitrogen of the atmosphere. Hans Bethe then calculated that such a reaction was theoretically impossible. It is unknown whether the U.S. would have eventually proceeded with the test anyway, had Bethe calculated a small but nonzero risk of destroying the world.

Observations about human extinction

David Raup and Jack Sepkoski claim there is a mysterious twenty-six-million-year periodicity in elevated extinction rates.[12] Based on past extinction rates, Raup and others infer that the average longevity of a typical vertebrate species is 2–4 million years. However, generalist, geographically dispersed species, like humans, may have a lower rate of extinction than species that require a particular habitat.
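As an illustrative back-of-the-envelope calculation (not taken from the cited sources), a mean species longevity of 2–4 million years, modeled as a constant-rate (memoryless) extinction process, implies an annual extinction probability roughly equal to the reciprocal of the mean longevity:

```python
def annual_extinction_probability(mean_longevity_years: float) -> float:
    """Annual hazard rate implied by a mean species lifetime,
    assuming a memoryless (exponential) extinction process."""
    return 1.0 / mean_longevity_years

# For the 2-4 million year range quoted above, the implied annual
# probability is on the order of 2.5e-7 to 5e-7 per year.
for longevity in (2e6, 4e6):
    p = annual_extinction_probability(longevity)
    print(f"Mean longevity {longevity:.0e} yr -> ~{p:.1e} per year")
```

This sketch assumes extinction risk is constant over time, which the quoted periodicity claim itself contradicts; it is meant only to show the order of magnitude involved.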

Jared Diamond's The Third Chimpanzee estimates that 64% of hunter-gatherer societies engage in warfare every two years. The combination of inventiveness and the history of violence in humans has been cited as evidence against humanity's long-term survival.[6]

Quantifying the impact

Carl Sagan argued:[13]

If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill "only" hundreds of millions of people. There are many other possible measures of the potential loss—including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise.

One scholar argues[14] that any risk of human extinction above 1 in 10^20 should ethically be considered "unacceptable".
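Sagan's "one million times greater" comparison can be illustrated numerically. The inputs below are assumptions chosen for demonstration, not figures taken from the cited sources:

```python
# Illustrative sketch of Sagan's expected-value comparison.
# Both inputs are assumed values for demonstration only.
deaths_nuclear_war = 5e8        # "hundreds of millions" killed outright
future_lives_foreclosed = 5e14  # assumed number of future people never
                                # born if humanity goes extinct

# Ratio of what is at stake in extinction versus a "modest" nuclear war.
ratio = future_lives_foreclosed / deaths_nuclear_war
print(f"Extinction stakes are ~{ratio:.0e} times greater")
```

With these assumed inputs the ratio comes out to roughly one million, matching the order of magnitude Sagan describes; the point of the exercise is that counting foreclosed future generations dominates any count of immediate deaths.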

Omnicide

Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear warfare or biological warfare,[15][16][17] but it can also apply to extinction through means such as global anthropogenic ecological catastrophe.[18]

Omnicide can be considered a subcategory of genocide.[19] Using the concept in this way, one can argue, for example, that:

The arms race is genocidal in intent given the fact that the United States and the Soviet Union are knowingly preparing to destroy each other as viable national and political groups.[20]

This concept of omnicide attempts to raise issues of human agency and moral responsibility in discussions about large-scale social processes like the nuclear arms race. To describe a human extinction scenario as 'omnicidal' is to claim that, if it were to happen, it would result not just from natural, uncontrollable evolutionary forces, or from some random catastrophe like an asteroid impact, but from deliberate choices made by human beings. In this view, such scenarios are preventable, and people whose choices make them more likely to happen should be held morally accountable.

In popular culture

The book The World Without Us by Alan Weisman deals with a thought experiment on what would happen to the planet, and especially human-made infrastructure, if humans suddenly disappeared. The Discovery Channel documentary miniseries The Future Is Wild shows the possible future of evolution on Earth without humans. The History Channel's special Life After People examines the possible future of life on Earth without humans, and was made into a series of the same name. The National Geographic Channel's special Aftermath: Population Zero envisions what the world would be like if all humans suddenly disappeared. A threat of human extinction drives the plot of innumerable science fiction stories, from When Worlds Collide to 2012. Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and A.I.

See also

Notes

^ Von Neumann said it was "absolutely certain (1) that there would be a nuclear war; and (2) that everyone would die in it" (underline added to quote from: The Nature of the Physical Universe – 1979, John Wiley & Sons, ISBN 0-471-03190-9, in H. Putnam's essay The place of facts in a world of values - page 113). This example illustrates why respectable scientists are very reluctant to go on record with extinction predictions: they can never be proven right. (The quotation is repeated by Leslie (1996) on page 26, on the subject of nuclear war annihilation, which he still considered a significant risk – in the mid 1990s.)

^ Although existential risks are less manageable by individuals than health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe." Source: "Practical application" page 39 of the Princeton University paper: Philosophical Implications of Inflationary Cosmology

^ For research on this, see Psychological Science, volume 15 (2004): Decisions From Experience and the Effect of Rare Events in Risky Choice. The under-perception of rare events mentioned above is actually the opposite of the phenomenon originally described by Kahneman in "prospect theory" (in the original experiments the likelihood of rare events was overestimated). However, further analysis of the bias has shown that both forms occur: when judging from description, people tend to overestimate the stated probability, so this effect taken alone would suggest that reading the extinction scenarios described here should make the reader overestimate any probabilities given. However, the effect more relevant to common consideration of human extinction is the bias that occurs with estimates from experience, which runs in the opposite direction: judging from personal experience, people who have never heard of or experienced their species becoming extinct would be expected to dramatically underestimate its likelihood. Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth." (Is Humanity Suicidal? The New York Times Magazine May 30, 1993).

^ An Abrupt.org 1996 editorial lists (and condemns) the arguments for humans' tendency to self-destruction. In this view, the history of humanity suggests that humans will be the cause of their own extinction. However, others have reached the opposite conclusion from the same data on violence, and hypothesize that as societies develop armies and weapons with greater destructive power, they tend to use them less often. It is claimed that this implies a more secure future, despite the development of WMD technology. As such this argument may constitute a form of deterrence theory. Counter-arguments against such views include the following: (1) all weapons ever designed have ultimately been used, and states with strong military forces tend to engage in military aggression; (2) although modern states have so far generally shown restraint in unleashing their most potent weapons, whatever rational control was guaranteed by government monopoly over such weapons becomes increasingly irrelevant in a world where individuals have access to the technology of mass destruction (as proposed in Our Final Hour, for example).

^ ReligiousTolerance.org says that Aum Supreme Truth is the only religion known to have planned Armageddon for non-believers. Their intention to unleash deadly viruses is covered in Our Final Hour, and by Aum watcher, Akihiko Misawa. The Gaia Liberation Front advocates (but is not known to have active plans for) total human genocide, see: GLF, A Modest Proposal. Leslie, 1996 says that Aum's collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.

^ Leslie (1996) discusses the survivorship bias (which he calls an "observational selection" effect on page 139). He says that the a priori certainty of observing an "undisasterous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe." (From: Random dynamics and relations between the number of fermion generations and the fine structure constants, Acta Physica Polonica B, May 1989).

^ For example, in the essay Why the future doesn't need us, computer scientist Bill Joy argued that human beings are likely to guarantee their own extinction through transhumanism. See: Wired archive, Why the future doesn't need us.

^ For the "West Germany" extrapolation see: Leslie, 1996 (The End of the World) in the "War, Pollution, and disease" chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the sterilization-for-rupees programs in India, and surveys other infertility or falling birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world. The world population is expected to begin falling around 2150.

^ See estimate of contact's probability at galactic-guide. Former NASA consultant David Brin's lengthy rebuttal to SETI enthusiasts' optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." (See full text at SETIleague.org.)

References

  1. ^ Vinn, O (2014). "Potential incompatibility of inherited behavior patterns with civilization". PublishResearch: 1–3. Retrieved 2014-03-05. 
  2. ^ Niall Firth (18 June 2010). "Human race 'will be extinct within 100 years', claims leading scientist". Daily Mail. Retrieved 28 January 2014. 
  3. ^ a b c Bostrom, Nick (March 2002). "Existential Risks". Journal of Evolution and Technology 9. Retrieved 28 January 2014. 
  4. ^ Anders Sandberg; Milan M. Ćirković (September 9, 2008). "How can we reduce the risk of human extinction?". Bulletin of the Atomic Scientists. Retrieved January 28, 2014. 
  5. ^ Fiorill, Joe (July 29, 2005). "Top U.S. Disease Fighters Warn of New Engineered Pathogens but Call Bioweapons Doomsday Unlikely". Global Security Newswire. Retrieved 10 September 2013. 
  6. ^ Damian Carrington (February 21, 2000). "Date set for desert Earth". BBC News. Retrieved January 28, 2014. 
  7. ^ Clara Moskowitz (February 26, 2008). "Earth's Final Sunset Predicted". space.com. Retrieved January 28, 2014. 
  8. ^ Schröder, K.-P.; Connon Smith, R. (2008). "Distant future of the Sun and Earth revisited". Monthly Notices of the Royal Astronomical Society 386: 155. arXiv:0801.4031. Bibcode:2008MNRAS.386..155S. doi:10.1111/j.1365-2966.2008.13022.x.
  9. ^ Can we be sure the world's population will stop rising?, BBC News, 13 October 2012
  10. ^ Chalmers, David (2010). "The singularity: A philosophical analysis". Journal of Consciousness Studies 17: 9–10. 
  11. ^ Warwick, Kevin (2004). I, Cyborg. University of Illinois Press. ISBN 978-0252072154. 
  12. ^ David M. Raup (1992). Extinction: Bad Genes or Bad Luck. Norton. ISBN 978-0393309270. 
  13. ^ Sagan, Carl (1983). "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs 62: 275. 
  14. ^ Tonn, Bruce E. (1 September 2009). "Obligations to future generations and acceptable risks of human extinction". Futures 41 (7): 427–435. doi:10.1016/j.futures.2009.01.009. 
  15. ^ Rose Somerville; John Somerville, introduction (1981). Soviet Marxism and nuclear war : an international debate : from the proceedings of the special colloquium of the XVth World Congress of Philosophy. Greenwood Press. p. 151. ISBN 978-0313225314. 
  16. ^ Goodman, Lisl Marburg; Lee Ann Hoff (1990). Omnicide: The Nuclear Dilemma. New York: Praeger. ISBN 978-0275932985. 
  17. ^ Daniel Landes, ed. (1991). Confronting Omnicide: Jewish Reflections on Weapons of Mass Destruction. Jason Aronson, Inc. ISBN 978-0876688519. 
  18. ^ Wilcox, Richard Brian. 2004. The Ecology of Hope: Environmental Grassroots Activism in Japan. Ph.D. Dissertation, Union Institute & University, College of Graduate Studies. Page 55.
  19. ^ Jones, Adam (2006). "A Seminal Work on Genocide". Security Dialogue 37 (1): 143–144. doi:10.1177/0967010606064141. 
  20. ^ Santoni, Ronald E. (1987). "Genocide, Nuclear Omnicide, and Individual Responsibility". Social Science Record 24 (2): 38–41. 

Further reading