Human extinction

From Wikipedia, the free encyclopedia

In futures studies, human extinction is the hypothetical end of the human species.

A number of anthropogenic extinction scenarios have been proposed for the near future: global nuclear annihilation, dysgenics, overpopulation,[1] biological warfare or an accidental pandemic, ecological collapse, and global warming; in addition, emerging technologies could bring about new extinction scenarios, such as advanced artificial intelligence, biotechnology or self-replicating nanobots. The probability of human extinction from human causes within the next hundred years is an active topic of debate.

In contrast, human extinction by wholly natural scenarios, such as meteor impact or large-scale volcanism, is extremely unlikely in the near future. Humanity has survived natural existential risks for hundreds of thousands of years; it would therefore be an unlikely piece of bad luck for a sufficiently large natural catastrophe to occur in the next hundred.[2]

Moral importance of existential risk[edit]

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress.[2] Many scholars make an argument based on the size of the "cosmic endowment" and state that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value. Some of the arguments run as follows:

  • Philosopher Derek Parfit makes a straightforward utilitarian argument that, because all human lives have roughly equal intrinsic value no matter where in time or space they are born, the large number of lives potentially saved in the future should be multiplied by the percentage chance that an action will save them, yielding a large net benefit for even tiny reductions in existential risk.[3]
  • Philosopher Robert Adams rejects Parfit's "impersonal" views, but speaks instead of a moral imperative for loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society- more just, more rewarding, and more peaceful... our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."[4]
  • Philosopher Nick Bostrom argues that preference-satisfactionist, democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.[2]:23–4

Size of the "cosmic endowment"[edit]

Parfit argues that, if the Earth is habitable for a billion more years, and if it can sustainably support a population of more than a billion, then there is a potential for 10^16 (or 10,000,000,000,000,000) human lives of normal duration.[3]:453–4 Bostrom goes further, stating that if the universe is empty, then the accessible universe can support at least 10^34 biological human life-years; and, if some humans were uploaded onto computers, could support the equivalent of at least 10^54 cybernetic human life-years.[2]
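Parfit's 10^16 figure follows from simple arithmetic, and the expected-value reasoning in the section above is a single multiplication. A back-of-envelope sketch (the 100-year lifespan and the risk-reduction figure are illustrative assumptions, not values from the cited sources):

```python
# Parfit's lower bound: a billion habitable years times a sustainable
# population of a billion, divided by an assumed ~100-year lifespan.
habitable_years = 1e9
sustainable_population = 1e9
years_per_life = 100  # illustrative assumption

potential_lives = habitable_years * sustainable_population / years_per_life
print(f"potential future lives: {potential_lives:.0e}")  # 1e+16

# Expected-value form of Parfit's argument: even a tiny reduction in
# extinction probability is worth an enormous number of expected lives.
risk_reduction = 1e-4  # illustrative: a 0.01-percentage-point reduction
expected_lives_saved = potential_lives * risk_reduction
print(f"expected lives saved: {expected_lives_saved:.0e}")  # 1e+12
```

On these assumptions, a one-in-ten-thousand reduction in extinction risk corresponds in expectation to a trillion lives, which is the arithmetic behind Parfit's conclusion that even tiny risk reductions carry large net benefit.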

Possible scenarios[edit]

Severe forms of known or recorded disasters[edit]

U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity" if left unchecked is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction.[7]

Habitat threats[edit]

  • About 1 billion years from now, the Sun's brightness will increase as its core hydrogen runs short, and the resulting heating of its outer layers will cause the Earth's oceans to evaporate, leaving only minor forms of life.[8] Well before this, the level of carbon dioxide in the atmosphere will be too low to support plant life, destroying the foundation of the food chains.[9] See Future of the Earth.
  • About 7–8 billion years from now, after the Sun has become a red giant, the Earth will probably be engulfed by an expanding Sun and destroyed.[10][11]

Population decline[edit]

Further information: Population decline
  • Preference for fewer children: if historical developed-world demographics are extrapolated, they suggest extinction before 3000 AD. (John A. Leslie estimates that if the reproduction rate dropped to the German level, the extinction date would be 2400.[1]) However, evolutionary biology suggests the demographic transition may reverse itself, and there is conflicting evidence that birth rates may be rising in the developed world in the 21st century.[12] The work of Hans Rosling, a Swedish physician, academic, statistician and public speaker, predicts global population peaking at less than 12 billion.[13]

Scientific accidents[edit]

  • The creators of the first "superintelligent" entity could make a mistake and inadvertently give it goals that lead it to immediately "annihilate" the human race.[5][14]
  • In his book Our Final Hour, Sir Martin Rees claims that without the appropriate regulation, scientific advancement increases the risk of human extinction as a result of the effects or use of new technology.[15] Some scenarios:
    • Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).[15][16]
    • Creation of a "micro black hole" on Earth during a scientific experiment, or other unlikely scientific accidents in high-energy physics research, such as a vacuum phase transition or strangelet incident. There were worries about the Large Hadron Collider at CERN, amid fears that collisions of protons at nearly the speed of light would create a black hole, but it has been pointed out that much more energetic collisions already take place in Earth's atmosphere.[15][17][18]

Near-Earth object[edit]

Near-Earth objects (NEOs) pose a threat to the survival of living species: even a small-scale impact can cause substantial local and regional damage.[19] Very few extraterrestrial impacts have been recorded in Earth's history, and none has caused casualties in recorded history. However, a single impact event[20] could lead to more death and destruction than any man-made war or epidemic.

One mitigation technique is the kinetic impactor: a spacecraft is deliberately crashed into the asteroid at high velocity (about 10 km/s), transferring momentum in the hope of marginally changing its orbit. A second observer spacecraft is vital for precisely measuring the resulting change in the asteroid's orbit. A second technique is nuclear blast deflection: detonating a suitable thermonuclear warhead near an NEO of about 150–200 meters in diameter could, hypothetically, deflect it by about 10 cm/s. A third possibility is the "gravity tractor," a spacecraft placed in the vicinity of the NEO that uses its own gravity to gradually alter the object's trajectory.[21]
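The kinetic-impactor technique described above is, at its core, a momentum-transfer calculation. A rough sketch under stated assumptions (the spacecraft mass, asteroid density, and momentum-enhancement factor are illustrative, not figures from the cited study):

```python
import math

# Back-of-envelope kinetic-impactor estimate: the spacecraft's momentum,
# spread over the asteroid's much larger mass, yields a tiny delta-v.
impactor_mass = 500.0    # kg; illustrative spacecraft mass (assumption)
impact_speed = 10_000.0  # m/s; the ~10 km/s figure from the text
beta = 1.0               # momentum-enhancement factor (1 = no ejecta boost)

radius = 75.0            # m; a 150-m-diameter NEO, as in the text
density = 2_000.0        # kg/m^3; assumed typical asteroid density
asteroid_mass = density * (4 / 3) * math.pi * radius ** 3

delta_v = beta * impactor_mass * impact_speed / asteroid_mass  # m/s
print(f"delta-v: {delta_v * 100:.3f} cm/s")

# Even a sub-cm/s nudge accumulates: pure along-track drift after a
# decade of lead time (ignoring orbital dynamics entirely).
lead_time = 10 * 365.25 * 86_400  # seconds in ten years
print(f"drift after 10 years: {delta_v * lead_time / 1000:.0f} km")
```

On these assumptions the impact changes the asteroid's speed by only about a tenth of a centimeter per second, yet over a decade that drift amounts to hundreds of kilometers, which is why early detection matters so much for deflection.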

Scenarios of extraterrestrial origin[edit]


Evolution into a new species[edit]

Main articles: Posthumanism and Transhumanism

Humans could evolve, through genetic engineering or technological modification, into a new species: posthumans.[25][26][27][28][29][30][31] Commentators such as Kevin Warwick and Ian Pearson point to the possibility of humans evolving by merging with technology.[32][33] Furthermore, "normal" biological evolution of humanity may continue, so that Homo sapiens may gradually transition into one or more new species.

Reactions to human extinction risk[edit]

Probability estimates[edit]

Because human extinction is unprecedented, speculation about the probability of different scenarios is highly subjective. Astronomer Martin Rees gives humanity a 50-50 chance of extinction during the 21st century, and Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25%, and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure.[2][34] A little more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. The 2006 Stern Review for the UK Treasury assumes the 100-year probability of human extinction is 10% in its economic calculations.[34]
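Century-level figures like the Stern Review's assumed 10% can be restated as an annual rate if one assumes a constant, independent yearly risk (a modeling simplification introduced here for illustration, not something the Review itself does):

```python
# Convert a 100-year extinction probability into a constant annual rate,
# assuming each year's risk is independent and identically distributed.
p_century = 0.10  # Stern Review's assumed 100-year extinction probability
p_annual = 1 - (1 - p_century) ** (1 / 100)
print(f"implied annual probability: {p_annual:.5f}")  # 0.00105

# Sanity check: compounding the annual rate recovers the century figure.
assert abs(1 - (1 - p_annual) ** 100 - p_century) < 1e-12
```

Under this simplification, a 10% century-level risk corresponds to roughly a 1-in-1,000 chance of extinction in any given year.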

Some scholars believe that certain scenarios such as global thermonuclear war would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation.[35] In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war.[34] Any number of events could lead to a massive loss of human life; but if the last few, most resilient, humans are unlikely to also die off, then that particular human extinction scenario is not credible.[36]


Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:[2][37]

Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".

All past predictions of human extinction have proven to be false; to some, this makes future warnings seem less credible. Nick Bostrom argues that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.[38]

Research and initiatives[edit]

Even though the importance and potential impact of research on existential risks is often highlighted, relatively few research efforts are made in this field, as Bostrom noted in 2001.[39]

However, multiple organizations with the goal of helping prevent human extinction have been founded, such as the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute and the Global Catastrophic Risk Institute.

Perception of human extinction risk[edit]

It is possible to do something about dietary or motor-vehicle health threats. It is much harder to know how existential threats should be minimized.[3]

Some behavioural finance scholars claim that recent evidence is given undue significance in risk analysis. Roughly speaking, "100-year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented.[4]

In 2010 Australian virologist Frank Fenner, notable for having a major role in the eradication of smallpox, predicted that the human race would be extinct in about a century.[40]

The threat of nuclear annihilation was a significant concern in the lives of many people from the 1950s through the 1980s.[41]

Observations about human extinction[edit]

David M. Raup and Jack Sepkoski claim there is a mysterious 26 million-year periodicity in elevated extinction rates.[42]

Milankovitch cycles are another example of such periodicity. These cycles describe how the Earth's movement through space drives climatic change: the shape of Earth's orbit varies on a roughly 100,000-year cycle, its axial tilt "wobbles" and alters the climate every 41,000 years, and axial precession changes the climate through variations in how the Earth rotates about its own axis. Raup found that a mass extinction occurs roughly every 26 million years. Roland Jansson and Mats Dynesius, in their article The Fate of Clades in a World of Recurrent Climatic Change: Milankovitch Oscillations and Evolution,[43] discuss how these cycles cause orbitally forced range dynamics (ORD), in which Milankovitch-driven climate change alters the geographic distribution of clades, groups of organisms theorized to have evolved from a common ancestor.

Milankovitch cycles, also known as Quaternary climatic oscillations, as described by Barker et al. in Quaternary Climatic Instability in South-East Australia from a Multi-Proxy Speleothem Record,[44] affect the climate by pushing it toward extremes of cold or heat.

Carl Sagan wrote:

If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill "only" hundreds of millions of people. There are many other possible measures of the potential loss—including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise.[45]


Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear warfare or biological warfare,[46][47][48] but it can also apply to extinction through means such as global anthropogenic ecological catastrophe.[49]

Omnicide can be considered a subcategory of genocide.[50]

Proposed countermeasures[edit]

Scientists such as Stephen Hawking have proposed that an initiative to colonize other planets within the solar system could improve the chance of human survival from planet-wide events such as global thermonuclear war.[52][53]

More economically, some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving global disaster.[34][35] Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.[34][54]

In popular culture[edit]

Some 21st century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[55][56] A threat of human extinction drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[57] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[58]

See also[edit]


^ Although existential risks are less manageable by individuals than health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe." Source: "Practical application" page 39 of the Princeton University paper: Philosophical Implications of Inflationary Cosmology

^ For research on this, see Psychological Science volume 15 (2004): Decisions From Experience and the Effect of Rare Events in Risky Choice. The under-perception of rare events mentioned above is actually the opposite of the phenomenon originally described by Kahneman in "prospect theory" (in their original experiments the likelihood of rare events is overestimated). However, further analysis of the bias has shown that both forms occur: when judging from description, people tend to overestimate the described probability, so this effect taken alone would indicate that reading the extinction scenarios described here should make the reader overestimate the likelihood of any probabilities given. However, the effect that is more relevant to common consideration of human extinction is the bias that occurs with estimates from experience, and these are in the opposite direction: when judging from personal experience, people who have never heard of or experienced their species becoming extinct would be expected to dramatically underestimate its likelihood. Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth." (Is Humanity Suicidal? The New York Times Magazine May 30, 1993).

^ says that Aum Supreme Truth is the only religion known to have planned Armageddon for non-believers. Their intention to unleash deadly viruses is covered in Our Final Hour, and by Aum watcher, Akihiko Misawa. The Gaia Liberation Front advocates (but is not known to have active plans for) total human genocide, see: GLF, A Modest Proposal. Leslie, 1996 says that Aum's collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.

^ Leslie (1996) discusses the survivorship bias (which he calls an "observational selection" effect on page 139); he says that the a priori certainty of observing an "undisasterous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe." (From: Random dynamics and relations between the number of fermion generations and the fine structure constants, Acta Physica Polonica B, May 1989).

^ For the "West Germany" extrapolation see: Leslie, 1996 (The End of the World) in the "War, Pollution, and disease" chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the sterilization-for-rupees programs in India, and surveys other infertility or falling birth-rate extinction scenarios. He says that the voluntary small family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world. In 2150 the world population is expected to start falling.

^ Former NASA consultant David Brin's lengthy rebuttal to SETI enthusiast's optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." (See full text at


  1. ^ Niall Firth (18 June 2010). "Human race 'will be extinct within 100 years', claims leading scientist". Daily Mail. Retrieved 28 January 2014. 
  2. ^ a b c d e f Bostrom, Nick. "Existential risk prevention as global priority". Global Policy 4.1 (2013): 15-31.
  3. ^ a b Parfit, D. (1984) Reasons and Persons. Oxford: Clarendon Press.
  4. ^ Adams, Robert Merrihew (October 1989). "Should Ethics be More Impersonal? a Critical Notice of Derek Parfit, Reasons and Persons". The Philosophical Review. 98 (4): 439. doi:10.2307/2185115. 
  5. ^ a b c Bostrom, Nick (March 2002). "Existential Risks". Journal of Evolution and Technology. 9. Retrieved 28 January 2014. 
  6. ^ Anders Sandberg; Milan M. Ćirković (September 9, 2008). "How can we reduce the risk of human extinction?". Bulletin of the Atomic Scientists. Retrieved January 28, 2014. 
  7. ^ Fiorill, Joe (July 29, 2005). "Top U.S. Disease Fighters Warn of New Engineered Pathogens but Call Bioweapons Doomsday Unlikely". Global Security Newswire. Retrieved 10 September 2013. 
  8. ^ Balzani, Vincenzo; Armaroli, Nicola (2010). Energy for a Sustainable World: From the Oil Age to a Sun-Powered Future. John Wiley & Sons. p. 181. ISBN 978-3-527-63361-6. 
  9. ^ Damian Carrington (February 21, 2000). "Date set for desert Earth". BBC News. Retrieved January 28, 2014. 
  10. ^ Clara Moskowitz (February 26, 2008). "Earth's Final Sunset Predicted". Retrieved January 28, 2014. 
  11. ^ Schröder, K. -P.; Connon Smith, R. (2008). "Distant future of the Sun and Earth revisited". Monthly Notices of the Royal Astronomical Society. 386: 155–163. arXiv:0801.4031free to read. Bibcode:2008MNRAS.386..155S. doi:10.1111/j.1365-2966.2008.13022.x. 
  12. ^ Can we be sure the world's population will stop rising?, BBC News, 13 October 2012
  13. ^ The best stats you've ever seen, Hans Rosling, TED Talks, Filmed Feb 2006
  14. ^ Chalmers, David (2010). "The singularity: A philosophical analysis" (PDF). Journal of Consciousness Studies. 17: 9–10. Retrieved 17 August 2013. 
  15. ^ a b c Martin Rees (2004). OUR FINAL HOUR: A Scientist's warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century — On Earth and Beyond. ISBN 0-465-06863-4
  16. ^ Bostrom 2002, section 4.8
  17. ^ Matthews, Robert (28 August 1999). "A black hole ate my planet". New Scientist. 
  18. ^ "Statement by the Executive Committee of the DPF on the Safety of Collisions at the Large Hadron Collider."
  19. ^ Perna . D; Barucci M.A; Fulchignoni .M (2013). "The Near-Earth Objects and Their Potential Threat To Our Planet". Astron Astrophys Rev. 
  20. ^ Alvarez, Luis W. (January 1983). "Experimental evidence that an asteroid impact led to the extinction of many species 65 million years ago" (PDF). Proc. Natl. Acad. Sci. U.S.A. 80: 627–42. doi:10.1073/pnas.80.2.627. PMC 393431free to read. PMID 16593274. 
  21. ^ Lu, Edward. T.; Love, Stanley G. (2005). "Gravitational Tractor For Towing Asteroids" (PDF). Nature. 438: 177–178. arXiv:astro-ph/0509595free to read. doi:10.1038/438177a. 
  22. ^ Bostrom 2002, section 4.10
  23. ^ Kluger, Jeffrey (21 December 2012). "The Super-Duper, Planet-Frying, Exploding Star That's Not Going to Hurt Us, So Please Stop Worrying About It". Time Magazine. Retrieved 20 December 2015. 
  24. ^ Tuthill, Peter. "WR 104: Technical Questions". Retrieved 20 December 2015. 
  25. ^ "EmTech: Get Ready for a New Human Species". Retrieved 1 July 2016. 
  26. ^ Thomas Aquinas : teacher of humanity : proceedings from the first conference of the Pontifical Academy of St. Thomas Aquinas held in the United States of America. ISBN 978-1443875547. Retrieved 1 July 2016. 
  27. ^ Perspectives on Health and Human Rights. Retrieved 1 July 2016. 
  28. ^ Miccoli, Anthony. Posthuman Suffering and the Technological Embrace. Retrieved 1 July 2016. 
  29. ^ "The Transhuman Future: Be More Than You Can Be". Retrieved 1 July 2016. 
  30. ^ "WILL YOU JOIN THE TRANSHUMAN EVOLUTION?". Retrieved 1 July 2016. 
  31. ^ "How humans are turning into a 'totally different species'". Retrieved 1 July 2016. 
  32. ^ Warwick, Kevin (2004). I, Cyborg. University of Illinois Press. ISBN 978-0-252-07215-4. 
  33. ^ Griffiths, Sarah. "Is technology causing us to 'evolve' into a new SPECIES? Expert believes super humans called Homo optimus will talk to machines and be 'digitally immortal' by 2050". DailyMail Online. Retrieved 1 July 2016. 
  34. ^ a b c d e Matheny, Jason G. "Reducing the risk of human extinction". Risk Analysis 27.5 (2007): 1335-1344.
  35. ^ a b Wells, Willard. Apocalypse when?. Praxis, 2009. ISBN 978-0387098364
  36. ^ Tonn, Bruce, and Donald MacGregor. "A singular chain of events". Futures 41.10 (2009): 706-714.
  37. ^ Yudkowsky, Eliezer. "Cognitive biases potentially affecting judgment of global risks". Global catastrophic risks 1 (2008): 86. p.114
  38. ^ "We're Underestimating the Risk of Human Extinction". The Atlantic. Retrieved 1 July 2016. 
  39. ^ Bostrom, Nick (2001). "Existential Risks - Analyzing Human Extinction Scenarios and Related Hazards" (PDF). 
  40. ^ Jones, Cheryl (16 June 2010). "Frank Fenner sees no hope for humans". The Australian. 
  41. ^ Ropeik, David. "The Rise of Nuclear Fear-How We Learned to Fear the Radiation". Scientific American. Retrieved 1 July 2016. 
  42. ^ David M. Raup (1992). Extinction: Bad Genes or Bad Luck. Norton. ISBN 978-0-393-30927-0. 
  43. ^ Jansson, Roland; Dynesius, Mats (2002). "The Fate of Clades in a World of Recurrent Climatic Change: Milankovitch Oscillations and Evolution". Annual Review of Ecology and Systematics. 33: 741–777. doi:10.1146/annurev.ecolsys.33.010802.150520. 
  44. ^ Barker, P. A. (2014). "Quaternary climatic instability in south-east Australia from a multi-proxy speleothem record". Journal of Quaternary Science. 29: 589–596. doi:10.1002/jqs.2734. 
  45. ^ Sagan, Carl (1983). "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs. 62: 275. doi:10.2307/20041818. 
  46. ^ Rose Somerville; John Somerville, introduction (1981). Soviet Marxism and nuclear war : an international debate : from the proceedings of the special colloquium of the XVth World Congress of Philosophy. Greenwood Press. p. 151. ISBN 978-0-313-22531-4. 
  47. ^ Goodman, Lisl Marburg; Lee Ann Hoff (1990). Omnicide: The Nuclear Dilemma. New York: Praeger. ISBN 978-0-275-93298-5. 
  48. ^ Daniel Landes, ed. (1991). Confronting Omnicide: Jewish Reflections on Weapons of Mass Destruction. Jason Aronson, Inc. ISBN 978-0-87668-851-9. 
  49. ^ Wilcox, Richard Brian. 2004. The Ecology of Hope: Environmental Grassroots Activism in Japan. Ph.D. Dissertation, Union Institute & University, College of Graduate Studies. Page 55.
  50. ^ Jones, Adam (2006). "A Seminal Work on Genocide". Security Dialogue. 37 (1): 143–144. doi:10.1177/0967010606064141. 
  51. ^ Santoni, Ronald E. (1987). "Genocide, Nuclear Omnicide, and Individual Responsibility". Social Science Record. 24 (2): 38–41. 
  52. ^ Malik, Tariq. "Stephen Hawking: Humanity Must Colonize Space to Survive". Retrieved 1 July 2016. 
  53. ^ Shukman, David. "Hawking: Humans at risk of lethal 'own goal'". BBC. Retrieved 1 July 2016. 
  54. ^ Hanson, Robin. "Catastrophe, social collapse, and human extinction". Global catastrophic risks 1 (2008): 357.
  55. ^ "He imagines a world without people. But why?". The Boston Globe. 18 August 2007. Retrieved 20 July 2016. 
  56. ^ Tucker, Neely (8 March 2008). "Depopulation Boom". The Washington Post. Retrieved 20 July 2016. 
  57. ^ Barcella, Laura (2012). The end: 50 apocalyptic visions from pop culture that you should know about -- before it's too late. San Francisco, CA: Zest Books. ISBN 978-0982732250. 
  58. ^ Dinello, Daniel (2005). Technophobia!: science fiction visions of posthuman technology (1st ed.). Austin: University of Texas press. ISBN 978-0-292-70986-7. 

Further reading[edit]