Talk:Global catastrophic risks

This article is of interest to the following WikiProjects:

  • WikiProject Astronomy (Rated C-class, Low-importance)
  • WikiProject Death (Rated C-class, Mid-importance)
  • WikiProject Disaster management (Rated C-class, Mid-importance)
  • WikiProject Extinction (Rated C-class, Mid-importance)
  • WikiProject Environment (Rated C-class, Mid-importance)
  • WikiProject Geology (Rated C-class, Low-importance)
  • WikiProject Science Fiction (Rated C-class, Mid-importance)



In a sentence[edit]

"The two greatest risks to Civilization, Humans, and the Planet Earth are Civilization and Humans"

How many thousands of times in how many thousands of years in how many thousands of ways has this been said?

Or at least civilization is the greatest risk to those others. Why are we lumping these three ideas together in one article? Wolfdog (talk) 17:12, 28 December 2013 (UTC)
There are potential risks stemming from our larger cosmos (e.g., asteroid or comet impacts, gamma-ray bursts from supernovae, the aging and expansion of our star, galactic collisions... potentially even entropy) that pose grave risks to humans and civilization, so I wouldn't necessarily agree with the statement that the two greatest risks to civilization, humans, and Planet Earth are civilization and humans. — Preceding unsigned comment added by Annaproject (talkcontribs) 00:48, 7 March 2014 (UTC)

Near Tautology[edit]

How about shortening that to "the Greatest Risk is Risk"? Human civilization per se does not pose a risk, unless we (wrongly) claim that every civilization that is not a globalized, industrial, forced-growth, money-focused civilization is not one. In that case we could equally well argue that this "globalized industrial forced-growth money-focused" monstrosity is not a civilization but something entirely different (see the definition of Civilization). It is only this particular brand of human civilization (of which there have been many) that poses existential risks not only to itself but to the Biosphere at large. The Khmer civilization, the Inca, the Maya and countless other civilizations did not pose any risk to the planet.

Suggestions[edit]

  • Change "Planet Earth" to "Earth's Biosphere". The (rocky) planet itself is rather unimpressed by human activities.
  • Change "Risk to civilization" to "Risk to human livelihood" (Civilizations constantly change - that is not a risk but normal cultural evolution)
  • Change "Civlization" on the risk side into "Perpetual growth based industrial civilization"

Wassermensch 15:29, 12 April 2014 (UTC)

Hyperinflation[edit]

Hyperinflation and economic collapse should be included as a cause of and contributor to civilizational collapse. Mustang19 (talk) 02:54, 26 July 2013 (UTC)

Can you provide any citations from the peer-reviewed literature? Rolf H Nelson (talk) 17:41, 6 August 2013 (UTC)

Distance Conversions: How many significant digits?[edit]

Converting 1, 3, 10 and 140 kilometres into miles: Since the kilometre figures are accurate to only one or two significant digits (not five or six), I have so rounded the converted miles. - Glenn L.
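For anyone checking the conversions, here is a minimal sketch (in Python) of the rounding described above; the conversion factor and the helper function are illustrative assumptions, not something taken from the article:

    from math import floor, log10

    def round_sig(x, sig):
        """Round x to the given number of significant figures."""
        return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

    KM_TO_MILES = 0.621371  # assumed conversion factor

    # Kilometre figures from the article, each good to one or two significant digits
    for km, sig in [(1, 1), (3, 1), (10, 1), (140, 2)]:
        miles = km * KM_TO_MILES
        print(f"{km} km is about {round_sig(miles, sig)} miles")
    # 1 km -> 0.6 miles, 3 km -> 2.0 miles, 10 km -> 6.0 miles, 140 km -> 87.0 miles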

End of the Earth[edit]

People say that the world is going to end on the 21st of December 2012.


http://www.washington.edu/newsroom/news/2003archive/01-03archive/k011303a.html

(reply to unsigned comment) Hmm, we're still here! 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:17, 17 April 2014 (UTC)

Self collapse[edit]

This user did not use original research for this: "Self-collapse - Permanent settlement must always end in a collapse. The idea depends on infinite resources, which we simply do not have. There is a great dependency on future generations or scientists to overcome this; however, reliance on this may not be the best idea. Civilizations have crashed over intervals of only hundreds of years throughout history, and the next one may be defined by running out of oil, or even water. Civilization has expanded for 10,000 years since the Neolithic revolution, and has been dependent on the expansion of agriculture. When there is not enough water to support more agriculture, or the larger populations that inevitably follow, there could indeed be a crash. 70% of the fresh water available today goes toward agriculture, with 17-20% more expected by 2020. <http://news.bbc.co.uk/2/hi/science/nature/755497.stm>" but I will agree to have it removed until I add sources and wikify it.

-Ishmaelblues —The preceding unsigned comment was added by Ishmaelblues (talkcontribs).

What is the focus of this article?[edit]

According to the intro: "The risks discussed in this article are at least Global and Terminal in intensity." So why is there a section on "Climate change and global warming" and another section on "Climate change and ecology"? First, why are there two, and second, those are neither global nor terminal. (Just because some people die does not make it "terminal" by the usage in this article; otherwise old age should be listed.) This entire article is littered with minor threats and needs a large trimming. Ariel. (talk) 09:12, 24 January 2011 (UTC)

Global warming has global/terminal potential. Green Cardamom (talk) 07:52, 7 February 2011 (UTC)
Global Warming, however, does not. The only actual extinction risks here are AI, Grey Goo, the cosmic risks and possibly Nuclear Winter. Civilization collapse is obviously easier (though most of these aren't even that). 82.11.1.60 (talk) 21:41, 7 May 2011 (UTC)

Proposed article split[edit]

I think this article should be split into natural and artificial risks (to civilization, etc.). I think that areas like the possibility of humans knocking a meteor out of a collision course with Earth counts as a natural risk.--Meximore (talk) 09:03, 28 March 2011 (UTC)

Good idea. 99.181.133.112 (talk) 21:32, 28 April 2011 (UTC)

It might be preferable to sort risks by probability. On a log scale the rough probabilities would sort according to their order of magnitude. The important risks would stand out. TeddyLiu (talk) 15:25, 14 April 2013 (UTC)

Reorganize the articles?[edit]

I think this article tries to put too many things together. It would be more manageable if it were divided into:

  • Existential risk - risks to civilization/end of civilization (humans may still exist, but not in a civilized society - à la Mad Max - though one could argue that this too is a type of civilization. OK, maybe we need a definition)
  • human extinction (no more humans anywhere)
  • end/destruction of planet Earth (humans on another planet/space station or extinct) [article to be created?]

...and I'm tempted to use Doomsday event as the main article. Has anyone got other suggestions? --Dia^ (talk) 20:46, 17 August 2011 (UTC)


I strongly recommend the avoidance of the term "existential threat". This term is incorrect, and basically an error. If it is intended to mean "threat to existence" then that is what should be written. "Existential" means something entirely different. 203.184.41.226 (talk) 02:17, 19 August 2012 (UTC)

@203.184.41.226 - yes, this phrase is correct. "Existential threat" means "threat to the existence of..." (it is often used in a specific form, e.g. "existential threat to America," or "existential threat to the Middle East," or "existential threat to the world economy"). If you are referring to "existentialism" (the philosophical topic), that is a term of art specific to the field of philosophy. The word "existential" means what colloquial rules of English suggest it would mean: related to existence. See, e.g., Merriam-Webster, or Cambridge University (which uses the similar term "existential risk"). — Preceding unsigned comment added by 24.130.70.182 (talk) 04:42, 17 January 2014 (UTC)

The unsourced section removed by Special:Contributions/Arthur_Rubin (see below) can have elements added.[edit]

Escalating Societal Disparity

Right now we are witnessing a greater disparity than there has been for centuries. The ability for small financial elites to make themselves richer through access to expanding technologies is considerable. This danger has been well documented by Marshall Brain, Martin Ford, Jeremy Rifkin and Noam Chomsky. Noam Chomsky writes:

"In this possible terminal phase of human existence Democracy and Freedom are more than 'values to be treasured' - they may well be essential to survival"

The problem is not affluence - it is the ability of the few to acquire exclusive means of getting richer, or to consolidate this power, at the expense of everyone else. A phenomenon very common in the third world is the very richest few percent of society buying all the agricultural land and using it to grow export crops, often being the root cause in these countries of widespread poverty and cyclical famines. It is very difficult to make this argument as it is constantly opposed by far-right or pro-market ideologues. Key technologies are likely to contribute to this escalation of wealth, namely 3D printing and nanoreplication, robotics, and ubiquitous computing. The idea of a Singularity makes assumptions that are equally valid for what has become known as a 'disparity hockey stick'. While mass-reproduced products benefit many, only a very small selection of humans has been enjoying the extreme riches that come with owning widespread automated infrastructure for the production of goods or services. This point is emphasized by Martin Ford in the book The Lights in the Tunnel and has been touched upon by Jeremy Rifkin in The End of Work. These two both argue strongly that in the next decade most paid labor will become automated or done by robots. This will end widespread societal access to jobs. That in itself creates a faster pace of irreversible unemployability, or "jobless growth", than we have ever seen before in human history. Already the middle class is evaporating in most western societies. In every previous society in human history, disparity and power asymmetry similar to the current epidemic have led to massive killings or revolt. It is no surprise we have in fact been seeing some early types of revolts in the Arab world, and more recently in London.

In our currently prevailing macro-economic model there is no solution to such a hypothetical crisis, other than the masses realizing this would imply the cancellation of the social contract and engaging in open revolt against the state. The troubling question remains how easy it would be to force out a hyper-empowered elite, since they could use the same automated technologies to protect their property rights and interests in arguably ruthless ways.

99.35.12.88 (talk) 02:48, 27 August 2011 (UTC)

This was removed by Special:Contributions/Arthur_Rubin ... "rift between the poor and the wealthy widens." where Economic inequality wikilinks "between the poor and the wealthy widens." with only the comment "Inappropriate edits.". Why inappropriate edits, Art? 97.87.29.188 (talk) 19:26, 27 August 2011 (UTC)
Why inappropriate edits, indeed. Yours is a clear WP:EGG. Perhaps "[[economic inequality|rift between the poor and wealthy]] widens" might be appropriate, although I don't think so. What you have is just absurd. — Arthur Rubin (talk) 22:49, 27 August 2011 (UTC)
The current issue of Foreign Policy magazine might be useful Rich Country, Poor Country; The economic divide continues to expand by Joshua E. Keating (September/October 2011). 99.181.138.168 (talk) 06:12, 28 August 2011 (UTC)
I suppose we have to assume that is correct, even though editorial-like and contradicted by the real evidence. That doesn't support your particular Wikilinking, though; mine might be close to the truth as seen in that article. — Arthur Rubin (talk) 06:27, 28 August 2011 (UTC)
"The Truth"? Maybe see Wikipedia:Truth ... Special:Contributions/Arthur_Rubin? 99.181.139.210 (talk) 01:53, 29 August 2011 (UTC)

Clarify "Runaway greenhouse effect" with Runaway climate change (Earth habitability specific).[edit]

Clarify "Runaway greenhouse effect" with Runaway climate change (Earth habitability specific). 216.250.156.66 (talk) 18:38, 29 August 2011 (UTC)

Mercury vs Venus: lots of big chunks too close for comfort?[edit]

If Mercury collides with Venus, isn't there the risk lots of significantly sized chunks will be flung out and stay close enough to hit Earth? --TiagoTiago (talk) 05:11, 10 November 2011 (UTC)

That is about as likely as the Earth falling into Sol. Orbits don't just change. Venus's orbit would have to magically decay at an incredible rate to somehow be near enough to Mercury to come remotely close. Mercury would probably become a moon, in that event. Much more likely is a moon-sized extra-solar-system object hitting one of the planets. And even that is pretty rare. In fact, the Moon is suspected by some to be the result of such a collision. Note that even 'mere' asteroids like the one that hit at the last mass extinction event ('XLE' or extinction-level event) are rarer and rarer every day as the solar system ages. It's kind of like if you start with a drawer full of random utensils and keep picking spoons out. Eventually, it gets pretty hard to find a spoon. In other words, there's only a finite number of asteroids above a certain size, and once they become part of a planet, they can't hit another. 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:12, 17 April 2014 (UTC)

Add Portal:Society and/or Portal:Environment[edit]

{{Portal box|Extinction|Society|Environment}} 99.181.156.221 (talk) 01:21, 18 November 2011 (UTC)

Why? Extinction seems reasonable, but not Society or Environment. — Arthur Rubin (talk) 03:29, 18 November 2011 (UTC)

Ocean acidification resource?[edit]

From Talk:Planetary_boundaries#resource.3F ... Acidifying oceans helped fuel mass extinction; Great die-off 250 million years ago could trace in part to waters' change in pH by Alexandra Witze October 8th, 2011; Vol.180 #8 (p. 10) Science News

99.181.134.6 (talk) 06:39, 18 November 2011 (UTC)

Identical sections of text in article[edit]

The following text, occurring both in section 2.1.3 Climate change and ecology (3rd paragraph) and section 2.1.5 Climate change and global warming, is identical. One of them needs to be removed or at least rewritten:


Around 70 percent of disasters are now climate related – up from around 50 percent from two decades ago.[30] These disasters take a heavier human toll and come with a higher price tag.[29] In the last decade, 2.4 billion people were affected by climate related disasters, compared to 1.7 billion in the previous decade and the cost of responding to disasters has risen tenfold between 1992 and 2008.[30] Destructive sudden heavy rains, intense tropical storms, repeated flooding and droughts are likely to increase, as will the vulnerability of local communities in the absence of strong concerted action.

117.206.41.60 (talk) 11:32, 19 November 2011 (UTC)

Not about eschatology or eschatological questions[edit]

   I've removed the following language --

The concept is expressed in various phrases such as "End of the World", "Doomsday", "Ragnarök", "Judgment Day", "Armageddon", "the Apocalypse", "Yawm al-Qiyāmah" and others.

-- whose markup reads

The concept is expressed in various phrases such as "[[Eschatology|End of the World]]", "[[Doomsday event|Doomsday]]", "[[Ragnarök]]", "[[Last Judgment|Judgment Day]]", "[[Armageddon]]", "the [[Apocalypse]]", "[[Yawm al-Qiyāmah]]" and others. <!--[is it really pertinent to definite Future studies here?]The prediction of future events is known as [[futures studies]].-->

Actually (contrary to the piping of its first link) "End of the World" does not mean eschatology, though that study does embrace many non-scientific approaches to the question

What will happen in the future, foreknowledge of which would put my mind at ease by making all this anxiety and/or torturous suffering in my own life seem worth enduring?

The actual relation between various end-of-the-world accounts and eschatology is this: Those who ask the eschatological question are more likely to be satisfied with answers that involve the end of the world, which is why the six other actual phrases (and likely the unspecified "others") tend to involve EOTW events.
   More to the point, the second of the lead paragraph's two sentences has nothing to do with the entire remainder of the article (including the title and the lead sentence), which is about big events entirely subject to scientific inquiry. How the eschatology sentence got into the article is a tantalizing question, but not worth pursuing IMO. It just needs to go out.
--Jerzyt 04:19, 20 November 2011 (UTC)

   BTW, note that the scope is accurately stated in the first titled section, "Types of risks": it does not extend its ambit to theories like the Steady state universe, the Big Crunch, Heat death, or coasting on the momentum of the Big Bang into arbitrarily large, arbitrarily rarefied, and arbitrarily interactionless voids - each of which anyone who thinks there is value in eschatology should feel obliged either to address or to explicitly deny the reality of. That failure is further evidence of the article's topic and eschatology passing only like two ships on a foggy night.
--Jerzyt 04:19, 20 November 2011 (UTC)

What is the probability that any one of these fatal risks could happen?[edit]

If we consider flipping a penny or getting a 6 from a die, the combined chance is at most 1/2 + 1/6 = 8/12 = 2/3. So the question is, how many events could lead to a fatal scenario? And what are their probabilities?

I think that is what this article is trying to get at. 24.25.237.226 (talk) 20:50, 5 December 2011 (UTC)

This is important, and poorly presented. We can ignore the eventual burn-out of the sun if the human species is most likely to go extinct before then. I think it would be valuable to assign a rough order-of-magnitude probability to each possibility so that they can be sorted by importance. TeddyLiu (talk) 15:29, 14 April 2013 (UTC)
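Since this keeps coming up, here is a minimal sketch of how per-risk estimates could be combined and sorted by order of magnitude, as suggested above; the probability values are invented placeholders for illustration, not figures from the article or its sources:

    import math

    # Placeholder per-century probabilities, for illustration only
    risks_per_century = {
        "asteroid impact": 1e-6,
        "supervolcano": 1e-4,
        "nuclear war": 1e-2,
        "engineered pandemic": 1e-2,
    }

    # Sort by rough order of magnitude so the dominant risks stand out
    for name, p in sorted(risks_per_century.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: ~10^{round(math.log10(p))}")

    # Simple addition (like 1/2 + 1/6 above) only gives an upper bound; for
    # independent events, the chance that at least one occurs is one minus the
    # product of the individual "nothing happens" probabilities.
    p_any = 1 - math.prod(1 - p for p in risks_per_century.values())
    print(f"chance of at least one catastrophe per century: {p_any:.4f}")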

Human survival[edit]

This article, as well as Human extinction, presents a rather gloomy view of the future. I'd like to suggest we contemplate putting together a complementary article called "Human survival" that summarizes information about potential technological solutions to the survival threats that humans face. It could cover the material in this article's "Precautions and prevention" section, as well as material from some other articles where there is coverage of the topic; namely: Space and survival, Geoengineering, and Planetary engineering. I've seen interest in this idea in other forums, so I think it can fly.

Here is one possible organization for the content:


Human survival in the future may depend on cultural adaptation to changes in the environment and the judicious use of technology to compensate for significant risks.

  1. Adapting to the environment
    1. Modified crops
    2. Genetic engineering of humans
    3. Alternative energy
    4. Renewable resources
  2. Adapting the environment to us
    1. Geoengineering
    2. Displacement of the Earth
    3. Space mining
    4. Modification of the Sun
  3. Changing to a new environment
    1. Space colonization
    2. Transhumanism
  4. Recovering from a massive die-off

What do you think? Regards, RJH (talk) 20:06, 22 June 2012 (UTC)

Proposal: Create a Portal for 'Existential Risk, Human Survival, and the Future of Complex Life in the Universe'[edit]

In response to RJH (talk) 20:06, 22 June 2012 (UTC):

You have an excellent idea there, and since the work being done on Existential Risk (see for instance the Cambridge Centre for the Study of Existential Risk prospectus - http://www.cser.org/) covers a wide range of crucial concepts, I believe it would make for an incredible Portal project.

I would propose, further, that the range of this Portal should include research on Existential Risk factors (sterilizing asteroids, etc), as well as the Fermi Paradox and the unresolved question of how widespread complex life is in the universe.  I have particular interest in sustainability and in forward-looking methods for safeguarding human survival, such as the building of Arcologies to minimize our footprint and provide stable havens from catastrophe, or archives for rebooting society.  And, should we survive, this theme merges into the interests of Transhumanism, as only by surviving the 'Great Filter' (if there is one) could humanity give rise to anything beyond itself.

(Apologies for not inline-linking these references; I'm on my iPad and it's a bit hard.  Will flesh out links from home.)

What might we need to do to proceed on such a project?

http://en.wikipedia.org/wiki/Fermi_paradox http://en.wikipedia.org/wiki/Arcology http://en.wikipedia.org/wiki/Transhumanism http://en.wikipedia.org/wiki/Great_filter

--Aqaraza (talk) 18:34, 16 July 2012 (UTC)

Thank you, Aqaraza. With regards to Portals, anybody can put one together per Wikipedia:Portal. You might try fleshing one out on a sandbox page in user space, then asking for feedback from some of the appropriate WikiProjects. Regards, RJH (talk) 21:26, 16 July 2012 (UTC)

Add book?[edit]

The Fate of Species: Why the Human Race May Cause Its Own Extinction and How We Can Stop It, by Fred Guterl, the executive editor of Scientific American. Here is an op-ed by Guterl: Searching for Clues to Calamity, July 20, 2012, NYT. 108.195.138.171 (talk) 07:05, 22 July 2012 (UTC)

ignition of atmosphere nonsense[edit]

Hi guys, I have just found this statement in the Article:

"Experimental accident: Investigations in nuclear and high energy physics could conceivably create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere"

It is a well-known fact that the burning of nitrogen is endothermic... — Preceding unsigned comment added by 91.146.151.79 (talk) 11:28, 16 September 2012 (UTC)

If you're interpreting "ignition" to mean "oxidation," Teller's concern was not with oxidation but fusion of nitrogen, ostensibly an exothermic process when radiation losses are neglected. --Vaughan Pratt (talk) 11:27, 5 April 2014 (UTC)

"Existential"?[edit]

It has become common to refer to an "existential threat" to humanity. But that is a misuse of the word existential, surely. "A threat to human existence" is a very different thing. 203.184.41.226 (talk) —Preceding undated comment added 00:27, 30 September 2012 (UTC)

"Existential" can mean "...relating to existence..." (Webster). So "Existential risk" and "existential threat" are semantically correct and also (to me) clear (obviously it means a threat to someone or something's existence), and anyway seems to be a prominent term in the peer-reviewed literature. The exact phrase "existential threat" also pops up in geopolitics, usually in the context of "existential threats to Israel".Rolf H Nelson (talk) 21:12, 15 June 2013 (UTC)

Article Name[edit]

Ok, I know this doesn't affect the quality, but does anyone else find the title a little bit cheesy? Surely, couldn't one think of one that didn't sound like a sci-fi novel? Like, I don't know, "List of potential dangers that could lead to mass death or annihilation"? All right. That title's not that great. But can somebody please think of something? It's kind of bugging me. 101.165.7.81 (talk) 06:16, 6 December 2012 (UTC)

I agree. How about "Survivability of humankind". I don't know how to edit titles, or I would. Anybody?
Will9194 (talk) 04:56, 8 December 2012 (UTC)
To change the title, the page must be moved. Detailed instructions can be found here. Mysterious Whisper (SHOUT) 14:52, 8 December 2012 (UTC)

I disagree the title should be changed. The title was discussed years ago (see talk archives) and it was very difficult to arrive at something people could live with. Remember that the title needs to reflect what is contained in the article. The suggestions above do not reflect the contents of this article. They reflect "human extinction", which is only one possibility. Other possibilities include the end of civilization while humans remain, or the entire planet being destroyed with no life remaining. There is no easy short way to put it unfortunately; it has to be literally spelled out. -- Green Cardamom (talk) 17:33, 8 December 2012 (UTC)

I suspected as much, which is why I simply provided the guide for controversial moves. Consensus can change (particularly from "years ago"), especially after a major rewrite (as is being proposed below), although I too think the current title is more appropriate than either suggested. Mysterious Whisper (SHOUT) 17:58, 8 December 2012 (UTC)
Here are the previous titles. The current literal title has had the fewest complaints over time (no title has been complaint-free). I personally think 'Existential risk' is the most academic and appropriate, if we were to rename, though it doesn't make it as clear as the current literal title. Concerned about the 'major rewrite' for the same reasons you raised. -- Green Cardamom (talk) 18:34, 8 December 2012 (UTC)

I think "humans" should be replaced with "humanity". Jruderman (talk) 05:04, 1 January 2013 (UTC)

Theories that don't believe in global warming[edit]

You know, not everyone believes in the whole "end of the world due to man-made global warming" thing - and plenty of them have good evidence. Shouldn't someone point that out, or at least acknowledge that some people don't think that it will happen? — Preceding unsigned comment added by 101.165.7.81 (talk) 06:29, 6 December 2012 (UTC)

Proposed major revision[edit]

I'm composing a major revision to this article. Would like your comments before posting it.

  • Article correctly states that man-made (and man-exacerbated) risks greatly exceed natural risks. However, scenarios for natural hazards get sub-sub-sections while the important man-made hazards are relegated to sub-sub-sub-sections. Man-made and natural hazards should be two section headings to emphasize that the former is serious while the latter is light reading. This is in accord with Meximore's suggestion.
  • I shall include a summary of the book Apocalypse When? (Springer, 2009) by Wells. Of the 5 or 6 books on this topic from mainstream publishers this is the only one that provides a top-down mathematical formula for a best estimate of survivability.
  • Let's omit, "Cambridge identified the "four greatest threats" to the human species: artificial intelligence, climate change, nuclear war and biotechnology." I think irresponsible geoengineering is a greater threat than nuclear war because the latter would be confined to the Northern Hemisphere where the nuclear powers are. Tierra del Fuego and South Island, NZ are isolated by two atmospheric Hadley cells. Already a rogue geoengineer, Russ George, illegally dumped 110 tons of iron into the Pacific. Another serious threat is 'miscellaneous', the hundreds of tiny risks that become serious in aggregate.
  • Article twice recites the names of rivers in southern Asia. Once is enough, or maybe zero.
  • I would remove Wiki links to ordinary words: food, arctic, carbon dioxide, Africa, California.
  • Add examples of hazards that have already happened: rogue astronomers sent signals to potentially hostile exoplanets, mousepox experiment, Stuxnet.
  • Add the Lifeboat Foundation and the think tank Global Catastrophic Risk Institute to the list of concerned organizations.

Will9194 (talk) 07:06, 8 December 2012 (UTC)

So, you want it to look something like this. As was suggested, the content is largely OK, but the tone is a bit off. Check out the Manual of Style, specifically WP:TONE, WP:NPOV, & MOS:LEAD. More references wouldn't hurt, although what's there is probably adequate. (I haven't been able to peruse the entire revision (yet), but this is what I've gathered from the first third or so.) Perhaps more pressingly, did you just state that you are the author of Apocalypse When?? Mysterious Whisper (SHOUT) 14:52, 8 December 2012 (UTC)

Will9194, there is a serious WP:COI (Conflict of Interest) here and you maybe should not be editing this article at all. You attempted to re-write this article in a way that is favorable to the POV of your book and its ideas and concepts. That's not how Wikipedia works (even if it currently did favor the POV of Bostrom, it was not written by Bostrom and it was done years ago when Bostrom was the main reliable source available). You may be an expert on this topic, but by including your book and ideas into this article you crossed the line and there has been a breach of trust. I have no problem including the POVs of Gott, Rees, Leslie, Wells, Posner, Joy and anyone else so long as it is handled neutrally as Mysterious Whisper said. The best way to do it is just to create sub-sections for each author and their POV; for example, see the theories of the Decline of the Roman Empire - each author and theory has its own sub-section and summary. --Green Cardamom (talk) 18:15, 8 December 2012 (UTC)

Well, I was going to wait for more information... but, yes, what Green Cardamom said. We take conflicts-of-interest very seriously because it is almost impossible to write from a neutral point of view while influenced by one. Which isn't to say you're barred from contributing; you're still free to edit this and other articles, but I'd recommend you create your sandbox (go here), and draft any suggested changes (to articles that deal with your book or related topics) there.
As for Green Cardamom's suggested layout, I don't think dividing all the topics by author would work well here; however, if the missing viewpoint(s) can be summarized into dedicated section(s) that could then be added to what's already there, that would probably be the best way to include them (for now). It would help greatly if there is evidence of other published authors sharing the same views. Mysterious Whisper (SHOUT) 19:01, 8 December 2012 (UTC)
Green Cardamom & Mysterious Whisper,
It never occurred to me that my edit might be regarded as a conflict of interest, because the mainstream book authors who estimated survivability, namely Rees, Leslie, Wells, get very similar results and so does the median opinion at the Global Catastrophic Risk Conference. (You might go to my original submission of 30 November and search for the words 'eerily' and 'eerie'.) So what we have is not conflict but closure. Maybe synergy is a better word for the following reason: Each of the other authors does a bottom-up estimate in which he studies individual hazards until his intuition settles on some numerical measure of survivability. By contrast I used a top-down approach based on a general principle that transcends the list of individual hazards. Thus, our agreement is synergistic, hence all the more important to present to the public.
Wikipedia strives for articles that are verifiable. You can slog thru Apocalypse When? and verify it mathematically step by step. Well, almost; some assumptions approximate the real world but don't match it perfectly. Alternatively, you can go to the book's website and find a digest.
Finally, I daresay that many of Wikipedia's edits and articles describe the author's own work. They just have the good sense to keep quiet about it. I suspect that these are among the best articles because these authors have incentive to hold the reader's interest. I agree that they are not neutral but suspect that other editors soon neutralize them. I've already had a small experience like that. On 1 May 2010 I edited the satellite de-spin mechanism to credit myself and two others with the original invention. I was immodest by displaying our names prominently in the main text. Two hours later LouScheffer neutralized it. Overall, it seems to me that Wikipedia is a wonderful system of checks and balances if you just turn it loose and let it run its course.
Will9194 (talk) 06:12, 10 December 2012 (UTC)
This does seem to happen often, though the guideline on it is dreadfully short.
I appreciate the disclosure, but the fact that one of your few other edits was also self-promotion doesn't help much.
It is immaterial whether persons of interest have edited other articles to their own benefit without being caught. If you had made several smaller revisions, used a more neutral tone, and focused less on your book, nobody would have cared if you were the author of one of the sources, as the result would have been inherently acceptable.
Whether or not the material in your book is definitively and demonstrably true also matters not. WP:Verifiability, used in this context, means simply that it must have been published previously by a reliable source. Hence, there is a difference between "verifiable" and "true". (Wikipedia concerns itself primarily with the former.)
Filling the article with 'connections', eerie or otherwise, between your and others' works, unless others have made and written about said connections, is unnecessary WP:Synthesis.
As for "let it run its course", that's what's happening now. You made a big edit with enough problems that the most efficient way to "neutralize" it was to undo it, at least temporarily. This reversion is followed by detailed discussion. It's a well-defined cycle. Your smaller edit was simply fixed because it was so simple to fix.
As I said, you are free to continue editing the article, although your contributions on this topic will likely be given special scrutiny. An alternate method is to draft edits in your sandbox (here is the bare code of your 30 November revision), although this will be tricky as you are trying to rewrite the bulk of the article. It would, however, allow you to continuously improve while soliciting community input. What you don't want to do is change the entire article, all in one edit, with little-to-no discussion; such might be reverted even when flawless.
Also note that Wikipedia aims to summarize significant opinions with representation in proportion to their prominence. Which is to say, the more you talk about your theories in the article, the more evidence there should be of other, independent sources talking about your theories. Mysterious Whisper (SHOUT) 16:10, 10 December 2012 (UTC)
Mysterious Whisper, Premium Mobile, & Green Cardamom,
Two issues about my proposed major edit; see bullet list above in my post of 8 Dec. Except for the second item, the only criticisms were matters of style. I've studied the rules and believe I can implement these corrections easily. Now that 9 days have elapsed, would this be a good time to go ahead? MW, you suggested I use my sandbox. Is that because you have access to it and can make critiques? Or is my sandbox just my private place to make rough drafts?
The major issue pertains to Item 2 since I am the author of the book to be summarized. (Don't buy the book; you can find a detailed digest here.) MW, you wrote, "The more you talk about your theories in the article, the more evidence there should be of other, independent sources talking about your theories." Unfortunately, I’m only aware of a couple: some fans in the physics department at U. Alaska, Fairbanks, and a lovely PowerPoint presentation from U. of Western Ontario. The latter is no longer online, but I can attach it to an email. These citations may not be adequate, in which case I’d like to apply for a waiver for the following reasons:
  • Highly favorable book reviews, summary and links at the book's website.
  • Springer's reputation as a world-class science publisher.
  • Supreme importance of the topic, human survivability. The public should know that humanity's risk is greater than most people would suppose.
  • Absence of any other mathematical treatment.
  • My own qualifications in applied math; go here and find my name (W.H. Wells) in red in the sidebar on the right.
  • Mathematical results substantiate educated guesses by important authorities: Martin Rees, John Leslie, Stephen Hawking, Bill Joy, and attendees at the Global Catastrophic Risk conference, Oxford U.
Will9194 (talk) 22:34, 17 December 2012 (UTC)
I'm sorry, Will. I don't know enough about the rules concerning COI to give you an informed answer. Like I said before, I only objected to the writing style being too narrative. I didn't have a problem with the substance of the edits. Primium mobile (talk) 15:41, 21 December 2012 (UTC)
────────────────────────────────────────────────────────────────────────────────────────────────────
If everything else (tone, weight, POV, etc) is perfect, we can ignore the COI.
If few others have talked about your presentation of your theories, and there are no other similar theories ("Absence of any other mathematical treatment"), it sounds a lot like a fringe theory. Any mention of such must be roughly in proportion to the number of reliable sources mentioning it (thus, quite brief). I'm afraid we don't make exceptions to this for any of your bulleted points. And again, no matter how well your results agree with those of others, you can't fill the article with such connections unless others have explicitly made them.
Wikipedia:Best practices for editors with conflicts of interest recommends that you not make any major edits to relevant articles; "Instead, make suggestions on article talk pages and let others decide whether to implement them", hence why I suggested you draft large changes in your sandbox first. But if you've read, understand, and can implement WP:NPOV, WP:V, WP:WEIGHT, WP:SYNTH, WP:TONE, & WP:LEAD, as well as the points made above, there's nothing stopping you from editing the article. At the very least, I'd still make several smaller changes instead of one major revision (if only to better facilitate discussion of individual points). Mysterious Whisper (SHOUT) 17:02, 21 December 2012 (UTC)

Dust[edit]

68.188.203.251 (talk) 14:15, 23 December 2012 (UTC) Please include dust. During this last pass through the galactic equator the heliosphere was seemingly open to cosmic dust. Electrostatic charge builds up on dust here on Earth, so the cosmic dust build-up could possibly be the source of a catastrophic stripping of Earth's electricity and hence magnetism. The Earth could develop such a charge that space currents could be drawn to it and discharge. This appears to have been a probable occurrence in the past, for example when there was claimed to be no Moon. k sisco

Sources? Mysterious Whisper (SHOUT) 15:47, 23 December 2012 (UTC)

Footnote 104 does not seem to support the claim brought forward o.O The article says 'The solar system passing through a cosmic dust cloud, leading to a severe global climate change', but the article used as a resource only writes about a recent increase in dust particles, and doesn't even mention the climate. 2001:630:12:242C:88BD:5444:49F1:DDA2 (talk) 23:09, 28 February 2013 (UTC) Mo

The Stars, Like Dust? — Arthur Rubin (talk) 00:49, 1 March 2013 (UTC)

Article Defaced[edit]

Under the heading of Chances of an Existential Catastrophe, the opening line appears to have been defaced: it reads "Some doo dars assasinated by flying pigs such as that from asteroid impact, with a one-in-a-million chance of causing humankind extinction in the next century,"

I would correct it but do not know enough about editing Wikipedia articles to correct them, and what the correct wording of this sentence was to begin with. 76.0.14.23 (talk) 11:11, 10 April 2013 (UTC)

Thank you for bringing this to attention. The passage in question was vandalized by 176.35.156.199 (talk · contribs) on 10:14, 10 April 2013‎, and restored by Wannabemodel (talk · contribs) on 11:24, 10 April 2013‎. - Mysterious Whisper 14:04, 10 April 2013 (UTC)


Published Research[edit]

I've been reading the literature on human survival and notice that some important research is not reported in this article. Somewhere (I'd prefer early on) we need a comparative discussion of the big picture, not just the identified risks discussed one at a time as in Section 4. Some of this is already included: Note 9, which refers to a conference at FHI, and Note 23, Richard Posner's book. However, we need to include (listed here in chronological order) the following …

  • Gott, Implications of the Copernican … Nature 363, p. 315, 1993
  • Leslie, The End of the World, Routledge, 1996
  • Posner, Catastrophe, Oxford, 2004
  • Rees, Our Final Century, Arrow Books, 2004
  • FHI conference, Oxford, 2008
  • Bostrom & Cirkovic, editors, Global Catastrophic Risks, 24 chapter authors, 2008
  • Wells, Apocalypse When?, Springer, 2009
  • Guterl, The Fate of the Species, Bloomsbury, 2012
  • Casti, X-Events (Extreme Events), William Morrow, 2012

Any more we should add?

Some comparisons: Unlike the others, Judge Posner puts strong emphasis on cost-benefit analysis. Each threat has some probability of happening and some number of casualties if it does. The product of the two is called the expected number of casualties. So divide the cost of prevention by the expected number to get the cost/benefit ratio. Posner would allocate resources where this ratio is least. He advocates expensive public programs to reduce the risk to humankind. Reviewers of this book are skeptical whether the public will bear the expense for benefits not apparent during their own lifetime. [link to Amazon's reviews]
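A minimal sketch of the cost-benefit arithmetic described above; the threat names and numbers are invented placeholders, not estimates from Posner's book:

    # Illustrative only: invented numbers, not Posner's actual estimates.
    threats = [
        # (name, probability per century, casualties if it happens, prevention cost in $)
        ("asteroid deflection program", 1e-6, 7e9, 2e9),
        ("pandemic preparedness", 1e-2, 1e8, 5e10),
    ]

    for name, prob, casualties, cost in threats:
        expected_casualties = prob * casualties   # probability times casualties
        ratio = cost / expected_casualties        # dollars per expected casualty averted
        print(f"{name}: cost/benefit ratio is about ${ratio:,.0f} per expected casualty averted")

    # Posner would allocate resources first to whichever program has the lowest ratio.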

There are two ways to analyze human survival, top-down and bottom-up. Bottom-up begins with a list of threats and then synthesizes the resultant overall risk. Top-down finds some principle that transcends individual threats, thus avoiding the need to make a complete list of them.

Gott's top-down analysis begins with an observer who determines the age A of some entity (Homo sapiens in our case). If there is nothing special about the time of observation, a reasonable best estimate would assume that the observer arrives at a random time within the life of the entity. In this case Gott shows that the probability the entity will be alive at future time F is

Prob(F|A) = A/(A+F).

This formula gives 1.0 at the time of observation (F=0) and then decays to zero in the infinite future. It works for many entities such as stage plays. But then Gott applied his formula to mankind, A=2000 centuries, just as though there were nothing special about the present. However, the other scholars on the list above think our time is very special because dangerous new technology is proliferating at an increasing rate. This would invalidate Gott's assumption and put us not at a random fraction of humanity's lifetime, but much closer to the end.
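As a quick check, plugging Gott's own input into the formula reproduces the 97.5%-over-51-centuries figure quoted further down; the snippet below is just that arithmetic, not anything taken from the cited sources:

    def gott_survival(A, F):
        """Gott's estimate: probability the entity is still alive at future time F, given age A."""
        return A / (A + F)

    A = 2000   # age of Homo sapiens, in centuries
    F = 51     # look 51 centuries ahead
    print(f"P(survive another {F} centuries) = {gott_survival(A, F):.3f}")   # about 0.975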

Wells developed arguments that alter Gott's formula and show that it will apply to humanity if we replace calendar time with some measure of cumulative risk exposure. Wells then estimated risk exposure based on measures of dangerously rapid "progress", especially statistics of U.S. patents issued and the number of papers published in science and engineering. These are reasonably consistent with other indicators, such as the number of pages published in Nature magazine and gross world product (pp. 77 & 78). Some issues of statistical weight degrade Wells' accuracy considerably, maybe a factor of two, but ultimately he derives a best estimate for survival probability expressed as a mathematical formula.

Guterl's book is strictly bottom-up and confined to orthodox threats, the ones nearly everybody recognizes: superviruses, big natural events (asteroid strikes, volcanoes, and the like), climate change, ecosystem degradation, synthetic biology, machines and artificial intelligence. He approaches these subjects like a journalist describing pertinent real-life events and interviews with people involved. Wells' approach could not be more different. He interviews nobody, but he does stress the importance of many small inconspicuous threats, which in aggregate may comprise a big threat, perhaps as big as some of the orthodox ones. He gives a couple of examples. This inability to list all significant threats undermines the bottom-up approach and conveniently highlights Wells' own top-down approach.

Four of the references listed above give numerical probabilities of survival. Gott's is the outlier by far for the reason explained above: 97.5% chance of survival for 51 centuries. Rees estimates 50% probability that civilization will suffer a major setback during the next 100 years, which might kill billions. Leslie thinks the probability of extinction is about 30% after 5 centuries. Since extinction is much less probable than civilization's collapse, these two estimates agree fairly well. A poll of experts at the Oxford FHI conference asked the chance of extinction during the next 100 years. Their median answer was 19%, again in good agreement. These three estimates are bottom-up. The participants studied individual threats and then decided on numbers that seemed intuitively about right.

By contrast, Wells used his formula and found the current risk rate for civilization's collapse to be about 9% per decade and the half-life of civilization to be about 9 billion person-centuries. (Human life expressed in population-time is analogous to labor to do a job expressed in man-hours.) If the average world population over that period is 9 billion people, then the half-life of civilization is 1 century, in agreement with Rees. However, Wells evaluated some parameters from empirical data, which gave him some wiggle room to adjust his answer. Moreover, his formulation contains simplifying assumptions that do not exactly match physical reality. Thus his answers could be off by a factor of two. Still, this can be regarded as decent agreement for a quantity as slippery as human survival.
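For readers wanting to check the consistency of these numbers, a small sketch of the generic constant-hazard-rate arithmetic; this is standard survival math, not Wells' actual derivation, and it assumes the 9% per decade can be treated as a constant hazard rate:

    import math

    rate_per_decade = 0.09   # roughly 9% risk of collapse per decade, as quoted above
    half_life_decades = math.log(2) / rate_per_decade
    print(f"half-life is about {half_life_decades:.1f} decades ({half_life_decades / 10:.2f} centuries)")
    # about 7.7 decades, i.e. a bit under one century - roughly consistent with the
    # "half-life of about 1 century" figure, given the factor-of-two uncertainty noted above.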

TeddyLiu (talk) 22:30, 15 April 2013 (UTC)

We can report on these things. If you are familiar with this field, perhaps you could summarize the major writers, works and theories? See for example Decline of the Roman Empire#Theories of a fall, decline, transition and continuity for how it is done in a field with many competing POVs and no clear correct answer. -- Green Cardamom (talk) 23:48, 15 April 2013 (UTC)

Looking at your edit today, some suggestions:

Rolf H Nelson (talk) 04:52, 19 June 2013 (UTC)

→Reverted the edit until it can get more discussion. Also, TeddyLiu, are you Wells, or someone with a connection to Wells? If so you should disclose that. Rolf H Nelson (talk) 05:15, 19 June 2013 (UTC)

I agree with Rolf H Nelson on all points, including the similarity to and POV of Willard Wells, who was here a few months ago as Will9194 (talk · contribs), proposing similar changes. As Rolf said, if you have any connection to Wells, you should disclose it immediately.
Additional notes on your proposed addition:
  • Lose the "as of May 2013"; don't ask the reader questions; avoid "More discussion below" and similar (the text "below" may later change); don't link section titles (as you did with many authors' names); in general, see WP:Manual of Style
  • We could work with summaries by author, but they should be shorter, more closely grouped by concept, and more focused on the concepts, rather than the authors themselves.
  • Have other major publications categorized the relevant literature as either "Descriptive" or "Analytic"? If not, we should probably avoid doing so.
  • Ensure the works you cite are reliable sources. As Rolf indicated, only about half of the "references" you provided are reliable sources, and even then, some are being used inappropriately.
  • At over 70kB, this is already a long article. Your addition brings it to nearly 100kB. The rule of thumb is that at these lengths, we should be more concerned with splitting the article apart, rather than adding to it. Make any additions as concise as possible.
  • There are not nearly enough references, which indicates that large portions of your text are original research. Further, the style and tone suggest original synthesis. We cannot accept either. I even see direct quotations without attribution. Be especially wary of making "connections" between (or even comparing) unrelated works (which have not already been made in reliable sources). We do not deal in the "ironic", "coincidental", etc.
You're clearly well read on this topic, and I'm sure this page could benefit from your work, but the addition you've presented is unacceptable. Please look through our core content policies and manual of style, and continue editing it in your sandbox. Mysterious Whisper 12:48, 19 June 2013 (UTC)


Re: Organizing on person vs idea, the suggested model of Decline of the Roman Empire#Theories of a fall, decline, transition and continuity is mostly organized on idea first, person second. However in this case, what we have here is each author presenting multiple original ideas. It's difficult to imagine how to provide a full survey of the various POV's of this topic without breaking it down along author lines. -- Green Cardamom (talk) 15:03, 19 June 2013 (UTC)

→Cost-benefit ratios are discussed both by Posner and by Bostrom (somewhere). Space habitats are advocated by Hawking and Rees. Curtailing civil liberties (not exactly a crowd-pleaser) is apparently advocated by Posner and Rees, and opposed by Casti. If Casti is the only one who talks about complexity, then his overall thesis might be fringe, but we can still quote him briefly to provide an alternate POV on the civil liberties thing. So I guess I'm opposed to a literature review per se, but am in favor of integrating all the literature found into the article. And again, I'm envisioning that the notable books can get their own brief pages. But, that's just my opinion; and if we get to a good draft doing it author-first I'll change my mind. Rolf H Nelson (talk) 02:22, 20 June 2013 (UTC)

I addressed the above on http://en.wikipedia.org/wiki/User:TeddyLiu/sandbox. Please make further comments there.TeddyLiu (talk) 02:29, 4 July 2013 (UTC)

Organizations studying existential risk[edit]

I've been looking at the list of organizations in Section 6. I followed the first link for U. Cambridge and found it merely a place holder for a proposed research centre, CSER. I first heard of it in July 2012, and I'm underwhelmed by their pace of progress. Suggest we delete them from the list of organizations until they get underway and have progress to report.

However, the site does have a nice quote by Prof. Huw Price: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology. When this happens we're no longer the smartest things around and will risk being at the mercy of machines that are not malicious, but machines whose interests don't include us." We could add that to the Quotations section following the one by Nick Bostrom.

I'd much rather see the Lifeboat Foundation listed among the organizations. [1] They have thousands of members and a scientific advisory board that includes every relevant specialty. Seems like the public should be aware of them.

GGGudex (talk) 00:31, 26 April 2013 (UTC)

As long as they have an article, I have no problem mentioning them there (In fact, I just added a mention). However, I see that said article is in jeopardy due to WP:NOTABILITY concerns. If the article gets deleted, we can still mention them here, although in that case I'd like to see a few good references. (Although, a few good references would also keep that article from getting deleted...) Also, while I'll assume good faith now, note that knowing the founder (as you stated at Talk:Lifeboat Foundation) presents a conflict of interest, and you should avoid appearing to advertise for them to prevent any misunderstanding. (Coming to the talk page, instead of just adding the information, was the best thing you could have done, and for that I thank you.) Mysterious Whisper 01:08, 26 April 2013 (UTC)
I did a Google search for the proposed Cambridge group, Centre for the Study of Existential Risk, and found myriad citations that meet your criteria, even though they are not underway yet. To me this suggests that the number of references is a poor way to evaluate "notability". It favors the elite and those who are already entrenched or well connected.
Incidentally, Jaan Tallinn, co-founder of CSER, jaan.tallinn@gmail.com, is a member of the advisory board of Lifeboat Foundation.
GGGudex (talk) 01:45, 4 May 2013 (UTC)
If you don't like the notability guideline, here is not the place to try to change it. In any event, "The notability guideline does not determine the content of articles, but only whether the topic should have its own article." I mentioned it only because 'whether the topic [has] its own article' is part of what appears to be the established basis for inclusion in that list (notice that every other entry has a main article), to prevent it from becoming an indiscriminate collection of information.
As I said, I don't necessarily mind mentioning them here even if the main article gets deleted, but in that case I'd like to see some references. Since my initial comment here, I've twice attempted a search for references, and the extreme lack of WP:RS coverage is astounding.
I currently support the list as it stands (with both the Centre for the Study of Existential Risk and the Lifeboat Foundation mentioned), although I'm becoming more disillusioned with the Lifeboat Foundation the more I look into it.
Mysterious Whisper 03:01, 4 May 2013 (UTC)

Proposal: change this article focus to 'global catastrophic risks', move 'human extinction' risks to 'human extinction' article[edit]

"Risks to civilization" is a more vague and much more broad category than "human extinction", and human extinction already has more than enough material to fill its own article. Currently there's significant overlap between the two articles.

Also, I'm confused as to what this article is about. The title includes 'Risks to civilization', but the header says it's about "existential risk". So is it about:

1. Human existential risk, as in the straightforward (and usual?) meaning of 'risks that the human race will no longer exist'?

2. Existential risk as used (coined?) by Bostrom 2002, who includes (idiosyncratic?) concerns about civilization being "permanently crippled" in the definition?

3. Risks to civilization, in the sense of global catastrophic risks that could kill billions but would not *directly* lead to human extinction or permanent crippling of the human race?

My proposal is that we:

1. Rename this article to 'Global catastrophic risks' or 'Risks to civilization'

2. Move all 'Human extinction' risks, except perhaps a brief summary, into the human extinction article, to reduce duplication.

Thoughts? Rolf H Nelson (talk)

This article is about risks to civilization and/or humanity and/or planet Earth. It covers the full scale of possibilities. We need a master top-level article for disaster scenarios. If the article becomes too long, split it along logical lines into sub-articles (by way of subsections with "main article" links). However, if you read this article, the sub-section on humanity is about risks from humanity, the cause of the disaster, not the scope of the disaster. Big difference. We don't try to slice and dice a scenario as only affecting humanity, or civilization, or Earth, because we can't know the scope of many disasters that have never happened before. And each scenario could impact on multiple scales. That is the fundamental problem with the human extinction article: it tries too hard to be about only one thing and then runs into a problem with its sources, which are more inclusive. BTW that article was created basically as a fork of this one not long after this one was created. Its editors were never cooperative with a merger and they never really developed the sourcing for it. IMO it should be merged into this one, and then we can discuss ways to create sub-articles if needed. -- Green Cardamom (talk) 06:18, 2 July 2013 (UTC)
It sounds like one thing we're in agreement on, then, is that the parts of this page that say "this article is about risks that are global and terminal" should be removed, as the scope of this article is broader than that. If there are no objections, I'll take them out, if nobody else beats me to it. Any thoughts on the article title? It seems to me that "risks to civilizations, humans, and planet Earth" is redundant; isn't a risk to humanity also a risk to civilization? Rolf H Nelson (talk)

Heat death

Just edited the bit about intelligence surviving the heat death. Right now, the consensus is that it can't, but there are prominent physicists out there who think otherwise. — Preceding unsigned comment added by 82.22.36.36 (talk) 06:35, 28 June 2013 (UTC)

I believe the Omega Point hypothesis is non-peer-reviewed WP:Fringe, and as a bonus has been discredited in the peer-reviewed literature. Rolf H Nelson (talk) 01:24, 1 July 2013 (UTC)

What I added has nothing to do with the Omega Point? — Preceding unsigned comment added by 82.22.36.36 (talk) 22:46, 1 July 2013 (UTC)

You're right, my bad. The cited URL www.aleph.se/Trans/Global/Omega/dyson.txt and the content confused me into thinking it was about Tipler's Omega Point and Final Anthropic Principle. Sorry about that. Let me start over then:
Dyson's 1979 paper is legitimate and isn't pseudo-science, but wouldn't you agree it has since been completely invalidated by the other paper you cited (http://adsabs.harvard.edu/abs/2000ApJ...531...22K), and by the discovery of the cosmological constant? (As an aside, I am not a physicist, but the argument in the second paper that "Eventually, the probability of a catastrophic failure induced by quantum mechanical fluctuations resulting in a loss of consciousness becomes important" seems definitive.) The paper concludes that "we find that eternal sentient material life is implausible in any universe"; is there evidence that this is still an open issue in physics post-1999? If not, I don't see a strong reason to reference the Dyson paper in this article. But, that's just my opinion. Rolf H Nelson (talk) 02:53, 2 July 2013 (UTC)

Yeah, you're right. However, I think it's important to at least mention the other proposed possibilities of life surviving in an expanding universe... as I said, trying to "program" or find a ready-made wormhole that is connected to another universe is something that an advanced civilization shouldn't have too much difficulty doing. But it's speculation.

Can you clarify? What other notable possibilities currently exist besides a wormhole, given 2000ApJ? Also, the physics that would be required to escape into baby universes could be expanded on, presumably in Ultimate_fate_of_the_universe#Life_in_a_mortal_universe, which is currently short on citations. It would also be good to have, somewhere on Wikipedia, information on what physics is required for traversable wormholes in general, since it comes up often in pop-science writings about science fiction. I assume you have to postulate multiple additions to the current (non-quantum-gravity) accepted laws of physics to make escape possible; it would be good to itemize what those postulated hypotheses are. Rolf H Nelson (talk)

Bostrom's risk graph

Looks like it was deleted for some reason[2]. Anyone know what happened (or how to view the reason for deletion)? -- Green Cardamom (talk) 13:39, 22 August 2013 (UTC)

Hmm.. another copy on Commons: File:X-risk chart.jpg .. -- Green Cardamom (talk) 13:41, 22 August 2013 (UTC)
What happened was that I bothered to get permission from the copyright holder to post the graph, rather than just pretending I was the copyright holder when I uploaded it, like most people do. My theory is that this is so rare that nobody at Wikipedia knows how to handle the situation. I sent them an email within the 7 days required to avoid deletion, but all I got was a response from either a bot that didn't understand my email or a human who didn't seem to actually read my email (I can't tell which). It's a minor pity, since the one I uploaded was a nice scalable SVG file. Rolf H Nelson (talk) 22:38, 25 August 2013 (UTC)

Inappropriate title

The name of this article (which specifically includes "planet Earth") contradicts the second and third sentences of the first paragraph, which state that this article will not cover global (i.e. planet Earth-scale) issues. It appears that we need to decide just what exactly this article is about. Also... why has the title "existential risk(s)" been passed up? Wolfdog (talk) 05:19, 27 August 2013 (UTC)
You're correct, the lead is wrong and confusing. Further, it does not reflect the content and scope of the article, which catalogs all events that threaten humanity on all scales: end of civilization, and/or human extinction, and/or planetary catastrophe. The term "existential risk" has been used by some writers, but it's not well established what it means; it is usually understood within the context of something else, i.e. existential risk from technology means something different from existential risk from a planet-destroying meteorite. In the end we still have to define the scope of the disaster, as Bostrom's chart shows. That's why the current title defines all scopes of any disaster type and thus covers all bases. I think if we renamed to "existential risk" there would be dispute over what existential risk means; for example, Bostrom's chart only includes a narrow window in the top right, and scenarios in this article would arguably have to be excluded from the definition (then the article would have to be renamed "Bostrom's existential risks"). We need a central article on Wikipedia to include all these scenarios in one place, and the current title serves that purpose. -- Green Cardamom (talk) 06:05, 27 August 2013 (UTC)
There's going to be even more dispute over what constitutes a "Risk to civilization, humans, or planet Earth". But the article should stay broader than just existential risk, because that's what the scope of the article has been. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
Wolfdog, should the article on human extinction be renamed to "existential risk"? I would argue no, because "human extinction" is less jargon-y. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
I don't think being swallowed by the Sun in 6 billion years is a "risk to planet Earth" in the media or literature or the English language. A risk implies a probability or a choice. Also, we shouldn't give WP:UNDUE weight to things that happen in billions of years compared with things that happen in the foreseeable future. For example, an article on the "future of Northern Ireland" wouldn't mention that Northern Ireland will be destroyed by continental drift in millions of years, because that's not the focus of mainstream discussion. By my logic, "risks" to Earth are a very small subcategory of risks to humanity, consisting solely of LHC-style threats that reliable sources give low weight to, and therefore "and planet Earth" should be dropped from the title as redundant and WP:UNDUE. Rolf H Nelson (talk) 19:20, 28 August 2013 (UTC)
A large-enough meteorite would in fact be a risk to humanity and planet Earth, possibly at any time. Since these events have never happened, it's very difficult to slice and dice based on the scope or time of the disaster. We just don't know when or how big the disaster could be. Thus it makes sense to be as inclusive as possible. -- Green Cardamom (talk) 19:56, 28 August 2013 (UTC)
A giant meteorite is usually framed as a risk to civilization or humanity or the biosphere, not the Earth. Unless you mean anything with a negative and global effect across the entire Earth is a "risk to the Earth", in which case we'd have to include things like "light pollution" and "extinction of the dolphins" as risks to the Earth. -- Rolf H Nelson (talk) 21:41, 28 August 2013 (UTC)
Let's at least remove the Earth being swallowed by the Sun et al if nobody objects, since we know the minimum timeframe of that disaster. -- Rolf H Nelson (talk) 21:41, 28 August 2013 (UTC)

Thoughts on what to do with the "Potential Sources of Existential Risk" section, which currently includes non-existential risks? Does that become "Potential Sources of Catastrophic Risks", or should there be two sections, one for existential risks, and one for catastrophic risks? Rolf H Nelson (talk) 21:59, 31 August 2013 (UTC)

Expansion of the Sun

The following statement cannot be true: "Ignoring tidal effects, the Earth would then orbit 1.7 AU (250,000,000 km) from the Sun at its maximum radius." 1.7 AU is 85% of the *diameter* of the Earth's orbit. The Earth would have to move to a higher orbit in order for this to be true. Perhaps the author meant to say 0.7AU? This still seems too large (see The Sun as a red giant.) — Preceding unsigned comment added by 96.237.189.88 (talk) 22:32, 2 September 2013 (UTC)

As the article states, the Sun loses mass during the red giant phase. This causes the radius of the Earth's orbit to increase from 1 AU to 1.7 AU. Rolf H Nelson (talk) 03:08, 4 September 2013 (UTC)
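(A rough sanity check of that figure, assuming the standard approximation for slow, adiabatic mass loss in which the orbital radius scales inversely with the central mass:

$r_{\text{final}} = r_{\text{initial}} \cdot \frac{M_{\text{initial}}}{M_{\text{final}}}$

so an expansion from 1 AU to 1.7 AU corresponds to the Sun retaining about $1/1.7 \approx 0.6$ of its present mass, i.e. losing roughly 40% of it by that point. The exact fraction depends on the mass-loss model used in the cited sources.)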


Andromeda and the Milky Way collide

When Andromeda and the Milky Way collide, this may present a threat to planet Earth (if planet Earth still exists). — Preceding unsigned comment added by Rvam1378 (talkcontribs) 22:56, 1 November 2013 (UTC)


A major flaw in this article

Many of the threats outlined in this article cannot truly be construed as 'existential risks'.

-Events such as nuclear warfare, dramatic climate change, collapse of agriculture, etc. would most likely kill a significant percentage of human beings and cause the destruction of human civilization, but most likely would not result in the complete eradication of the human species.

-More serious threats described here (mostly of cosmic origin), such as impact events, gamma-ray bursts, etc., would most likely destroy nearly all of the human population (99%+). However, unless the event is truly catastrophic enough to wipe out the entire vertebrate phylum, the fact that current human populations are so widely distributed will allow small, isolated pockets of individuals to survive.

-Even a global pandemic with truly exceptional virulence (e.g. HIV that evolves to spread airborne) would probably fail to cause human extinction, because humans have tremendous genetic diversity. Any such pandemic would most likely be survived by a small minority of individuals.


Humans not only have a high population and global distribution, but also have advanced technology and intelligence at their disposal. Therefore the only legitimate 'existential risks' I see in this article are:

Near future

-artificial intelligence

-experimental accidents

-biotechnology (engineered pandemic)

-single cosmic event (GRB, impact event etc) of apocalyptic magnitude

-multiple cataclysmic events occurring back to back (i.e. within a few thousand years of each other) without giving humans a chance to recover

Far future

-expansion of the sun (if humans have not colonized other star systems)

-heat death of the universe

This article includes catastrophic risks to civilization, not just existential risks; there's a separate article more narrowly about Human Extinction. BTW don't forget to sign your comments with four tildes. Rolf H Nelson (talk) 23:48, 24 November 2013 (UTC)

Removed "clarification needed"[edit]

"Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century, have had their probabilities predicted with considerable precision" has been tagged "clarify|reason=How can it be stated "considerable precision" if we can't test "precisely" what the odds really are?" This sounds more like a criticism or counterpoint than a request for clarification, so I removed the tag. Of course, as always, the criticism can be added within the article if desired, if a source is given. If something's actually unclear, rather than contested, please clarify the clarification request here on the talk page. Rolf H Nelson (talk) 06:30, 11 December 2013 (UTC)

Title - dubious

As has been repeatedly raised in many of the discussions above, this article's focus and thus its title are dubious. I have tagged the article with a "disputed title" template, and have also recommended that the article undergo some kind of splitting process. A risk to civilization is very different from a risk to humans, which is very different from a risk to the whole planet Earth.

Civilization, humanity, and planet Earth are items that are not necessary to the existence of one another (except perhaps in one direction, and I don't, by the way, mean to imply that they aren't related). For example, a (hypothetical) peaceful, intentional dismantling of hierarchical social structures around the world is a "risk" to civilization, but not at all necessarily to humanity or to the planet. A large meteorite is a risk to all three, but then shouldn't that just be placed under an article titled something like "Risks to planet Earth"? It's not a risk specific enough to civilization, for example, to be placed under an article called "Risks to civilization." A meteorite is also a risk to horses, trees, house flies, diamonds, cardboard, works of art, ecosystems as a whole, or specific human individuals. But each of those doesn't deserve its own "Risks to..." article. If there is enough literature on risks to civilization, for instance, then that idea deserves its own independent article. Civilization, humans, and Earth, however, should not be sloppily lumped together like this in one article. Some writers, apparently wanting to lump even more items here, have mentioned "the annihilation of... even the entire universe" -- just demonstrating how easily out of hand this is getting. We seriously need some coherence here and probably some splitting. Wolfdog (talk) 17:10, 28 December 2013 (UTC)

If there isn't sufficient literature on "risks to civilization" as a category to merit an article, then the problem solves itself; we can just delete much of the current article as insufficiently sourced. However, I suspect there is sufficient literature, and I suspect that said literature includes risks of human extinction within its purview. I know the FHI's Global Catastrophic Risks included human extinction. Rolf H Nelson (talk) 03:55, 1 January 2014 (UTC)
What would you propose should go in a "Risks to planet Earth" article besides (arguably) meteorites? Rolf H Nelson (talk) 03:55, 1 January 2014 (UTC)
If that is the only sourced risk to planet Earth, then this article hardly needs to include "planet Earth" in its title. There doesn't, in that case, need to be any article about risks to planet Earth. We can just discuss such an issue under "Meteorites." Wolfdog (talk) 23:30, 5 January 2014 (UTC)

There are plenty of sources that cover the topic as a whole, so there is no problem with the topic itself, nor a need to split the article. As mentioned previously, the article title is just a placeholder to help readers understand what the article is about. If you go to "What links here" and look at the "Redirects" you will see dozens of alternatives; the others are less clear and more ambiguous. Wolfdog .. how familiar are you with existential studies and scenarios? The article doesn't say the things you are saying; it doesn't say that one thing is dependent on another. Nor does the literature make such neat and fine distinctions between civilization and other risks .. we have no idea how far-ranging these risks may be; they are simply "risks" that could impact things on various scales. The title should not be interpreted so literally to the point of trying to split it into fine categories. The literature itself makes no such categorized distinctions, nor should we. The article doesn't make that distinction either. It could read "Risks to civilization and/or humans and/or planet Earth", but that would be long and awkward. I'm afraid the confusion here is an overly-literal interpretation of the title without reading the lead section, which is more nuanced, and where the actual scope of the article is defined. -- GreenC 16:34, 1 January 2014 (UTC)

We do not need to split the article if, as you say, it is sufficiently sourced and referred to in the sources as one unit. But doesn't an article about risks to three different (and specifically named) issues intend to cover those three as if they are similar enough to fall under a single page? If the article isn't about "such neat and fine distinctions" regarding risks, but rather about "just simply 'risks'" of various (but basically huge) scales, then the title still seems awkward and inappropriate. What is your position regarding other users' past suggestions that this title be moved back to the name "Existential risks"? Of course, the problem with this title in the past was that it seemed too unclear, a quality you mention as being undesirable. However, the phrase "existential risk (or catastrophe)" does recur quite often throughout the article (and without any set definition). How about something clearer but still broad, like "Global catastrophic risks" (which also shows up in the article)? (However, this seems biased to only one of the three risks aforementioned, which I feel is inevitable with most titles; that's why I suggested splitting to begin with.) I'm not an expert on existential risks, but I agree that since "the literature itself makes no such categorized distinctions, nor should we." However, I disagree with your comment that "The article [title(?)] doesn't make that distinction either." I feel that the title makes three blatant distinctions (perhaps arbitrarily chosen, but then sloppily so). If those are not the only three items in question, then the title is (amazingly) too narrow and not broad enough. What are we really talking about here in this article? Wolfdog (talk) 23:30, 5 January 2014 (UTC)
Is your current proposal "global catastrophic risks"? I'm in favor, as I've stated before, but I'd like to hear what Green Cardamom and/or others think. My main personal concern is whether anyone would find "global catastrophic risks" unclear to laymen or too jargon-y; I think it's fairly straightforward, though. "Global catastrophic risks" gets about 10x as many hits as "Risks to civilization" in a quick Google Scholar search, though admittedly most of those uses are from Nick Bostrom and associates. Rolf H Nelson (talk) 02:54, 8 January 2014 (UTC)
This is a topic without a commonly accepted term; there are many terms, at least 42 of them, so there will always be a lack of clarity no matter what title we choose. Looking at Bostrom, who wrestled with this same problem: he titled it Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards .. using that as an example, we could call it "Global catastrophic risk scenarios and related hazards" .. the "related hazards" leaves the title open to interpretation (i.e. for more information, read the article), although this too can cause complaints about lack of clarity. Ultimately it's up to us editors to decide what the scope of this article will be, and scope is defined in the lead section, not by the title, so any title is defensible so long as it mirrors the article. But any title will be open to complaint since there is no commonly accepted term for it. -- GreenC 04:43, 8 January 2014 (UTC)
Interesting points. I was simply observing that, again and again, users have attacked the title as too broad, vague, etc. (and, of course, 42 other options don't help the fact that we must come to a conclusion on one name). I guess I would, yes, personally go with "Global catastrophic risks," though I too would like to hear other users' input since, again here, I'm not an expert on the issue at all. It's occurring to me now that what's making me most uneasy, I suppose, is that "civilization" is the least like the other two. If we can clarify the title even a little bit, why not? I'm looking now at the various risks (or sources of risks) themselves. Of them, almost all (warfare and mass destruction, man-made global warming, eco-disaster, population/agricultural crisis, experimental accident, biotechnology, pandemic, ice age, volcanism, etc.) concern the destruction of planet Earth or enough of Earth's environment that it can be assumed humans could very suddenly go extinct. To me, to throw "civilization" in the mix as well is trivial and possibly misleading. The only two issues I can see that fairly specifically focus on the destruction of civilization(s?) are "Artificial intelligence" and "Extraterrestrial invasion." And even these two can be simply considered generalized global risks. So, to make this long story short: civilization appears the most out of place in the title. Wolfdog (talk) 22:11, 8 January 2014 (UTC)
"warfare and mass destruction, man-made global warming, eco-disaster, population/agricultural crisis, experimental accident, biotechnology, pandemic, ice age, volcanism" Is your point just that any risk to civilization could, in theory, cause humans to go extinct if we had enough bad luck? I think most sources would disagree that an ice age, for example, would likely somehow kill everyone off every last human being alive. Rolf H Nelson (talk) 04:24, 9 January 2014 (UTC)
Well, now you're bringing in a whole new debate and, I think, a new set of premises. Calling an ice age a "risk to civilization" seems trivial (just like we wouldn't say that an ice age is a risk to computer technology or a risk to the literary arts or a risk to vacation resorts -- these are all true, but they're trivial). What is relevant is that an ice age is a "risk to human existence (as a whole, or, at least, in large part)" or a "risk to much of the life on planet Earth." And in response to your doubts about an ice age killing "every last human being alive," I must say that a risk that leaves only a handful of organisms of some species alive, from an ecological point of view, is still a risk to that whole species. It is ecologically highly unlikely that 10 surviving red foxes will reproduce successfully enough in a newly devastated ecosystem to continue existing as a species; in other words, those red foxes (and so their whole species) are in the midst of an extinction event. (Likewise, an ice age is certainly a risk to human existence.) Wolfdog (talk) 05:16, 9 January 2014 (UTC)
Now here are a few titles I'm proposing: [1] Global catastrophic risks | [2] Existential risks (despite this title's vagueness, at least this doesn't throw "civilization" bizarrely into the mix -- its definition can be discussed more fully in the article itself, and perhaps even explained as controversial with many interpretations) | [3] Risks to humanity (or to the human race or to the human species) and planet Earth (this is one that, personally, seems still too specific; since we have not defined this article and are allowing the opening paragraphs to flesh it out, it is probably best that we have a broader rather than narrower title) .........Other ideas or do any of these suffice? Wolfdog (talk) 05:29, 9 January 2014 (UTC)
Given that the concept of 'risks to civilization' passes WP:NOTABILITY, changing the set of items covered *in this specific article* to exclude risks to civilization, without placing them in a specific alternative article, is not an option. Direct, rather than indirect, risks and threats of human extinction (which includes the Sun's expansion) are covered in human extinction. The article on risks to civilization can't exclude human extinction risks, because the sources cited don't appear to exclude human extinction risks (unless, *maybe*, the article gets too large and needs to be split). So I don't see the article scope changing. Rolf H Nelson (talk) 05:04, 10 January 2014 (UTC)
I'm fine with Existential risk; it's more obtuse and academic, but this is Wikipedia and we are supposed to be academic. Bostrom's book is called Existential Risks, so there is support in the sources. The definition of existential risk, according to Bostrom, covers the current title (civilization, humanity, planet Earth), so there is no problem with the article scope. I think "global catastrophic risks" works also, but that would be my second choice because "global" will become a source of contention (a certain disaster scenario may not be global in scale yet still count as an existential risk). -- GreenC 15:59, 10 January 2014 (UTC)
I think there's some (understandable) confusion here. By Bostrom 2002's definition, most risks to civilization aren't existential risks; only risks from which civilization would be permanently crippled, rather than being able to recover within a generation, are existential risks. "Repressive totalitarian global regime" is one example he gives. Outside of Bostrom, the scope of "existential risk" seems to be narrower, focusing more squarely on human extinction. Bostrom et al's Global Catastrophic Risks has a broader focus, more along the lines of what most people would characterize as "risks to civilization" (although a bit broader, including catastrophes that kill a lot of people but don't imperil civilization). Please let me know if you disagree; this is a pretty important consideration towards making the article consistent and understandable. Rolf H Nelson (talk) 23:10, 11 January 2014 (UTC)
I see. Sorry for the confusion. You're right, Bostrom's book is called Global Catastrophic Risks, and the Oxford-based Future of Humanity Institute has http://www.global-catastrophic-risks.com/ (chaired by Bostrom), and there is http://gcrinstitute.org/ (Global Catastrophic Risk Institute), founded by Seth Baum .. so those are some evidence-based reasons to use it. The danger then becomes whether we have a POV problem in supporting a certain researcher's coinage. But it helps that it's not just Bostrom. -- GreenC 04:51, 12 January 2014 (UTC)
Perhaps the problem for me here is that the sources all use "civilization" and then rarely define it, though it may have various definitions (and though I have a very strict definition of it in my head, used for example in social sciences like anthropology; civilization: a type of human society based on densely populated settlement(s), agriculture as the almost total source of food supply, importation of resources, and constant expansion). If the sources are unclear, I guess there is nothing to do about the title. It is merely reflecting those references that loosely throw the term around. The way I see it, the term "Risks to civilization" is used in contradiction with the sources, which seem to use it almost simply to mean "Risks to a large portion of humanity." Is that the case? Wolfdog (talk) 02:03, 13 January 2014 (UTC)
I agree that our sources don't generally distinguish between "risks to a large portion of humanity" and "risks to civilization". Perhaps this is a "contradiction", or perhaps it's just pragmatic because most risks to one are also a risk to the other. Rolf H Nelson (talk) 01:51, 23 January 2014 (UTC)

Requested move

The following discussion is an archived discussion of a requested move. Please do not modify it. Subsequent comments should be made in a new section on the talk page. Editors desiring to contest the closing decision should consider a move review. No further edits should be made to this section.

The result of the move request was: Page moved to Global catastrophic risks Rolf H Nelson (talk) 04:42, 29 January 2014 (UTC) (non-admin closure)



Risks to civilization, humans, and planet Earth → Global catastrophic risks – Nick Bostrom et al's term "Global catastrophic risks" is a more apt name. Although it may be considered a broader title than the current one, this is appropriate for the expanse of the article's content. Discussions and multiple citations here demonstrate Bostrom's position as the foremost scholar in this area, who has himself contended with the controversies of naming such a class of risks. [3] Wolfdog (talk) 23:39, 20 January 2014 (UTC)

  • Regardless of any scholar's naming policy, we follow Wikipedia's here. But WP:CONCISE seems to lead me to strongly support. Red Slash 03:54, 21 January 2014 (UTC)
  • Support. -- GreenC 04:43, 21 January 2014 (UTC)
  • Support. Makes sense, including for succinctness. DA Sonnenfeld (talk) 14:56, 21 January 2014 (UTC)
  • Support. Moving since there's no evidence of controversy. Edit: actually I didn't, the 'move' button is missing, probably it gets suppressed while a Requested Move discussion is open. Rolf H Nelson (talk) 02:01, 23 January 2014 (UTC)
  • Support per conciseness as mentioned. Current title could conceivably allow for any and all human risks. Not sure the suggested title is ideal as I don't think it's a phrase in common usage but I lack a better suggestion. (Also it's nice to avoid the US English and Oxford comma.) benmoore 22:07, 23 January 2014 (UTC)
  • Support. Better than current title, although I would prefer "Risks of Human Extinction" or "Existential Risks." Although far less likely to end humanity than the classically included x-risks, generic GCRs can still be in this category since they could push us towards a more fragile state or even prevent long-term "technological maturity." Anthropic Optimist (talk) 13:30, 27 January 2014 (UTC)
  • Support. Better title. GCR usage in the literature does encompass creeping risks beside the dramatic catastrophes, so the article contents still fit. I think it might be helpful later to separate out the issues and explanations unique for existential risks (finality, special moral weight) to a separate section or their own page, but the base risks do belong here. Anders Sandberg (talk) 02:10, 29 January 2014 (UTC)
The above discussion is preserved as an archive of a requested move. Please do not modify it. Subsequent comments should be made in a new section on this talk page or in a move review. No further edits should be made to this section.

Global catastrophic risk vs risks

Speaking of this class of risks in the singular is awkward and unusual. The article is about a class of risks that includes many risks. By making it singular, it is no longer about a class but about a single risk, grammatically speaking. WP:SINGULAR says "Exceptions include .. the names of classes of objects (e.g. Arabic numerals or Bantu languages)." It's the same here: this topic includes many risks and the title should reflect that; we don't say "Arabic numeral" unless the topic is about a single number. -- GreenC 22:18, 30 January 2014 (UTC)

Ice age section incorrect?

I believe there were civilizations that were primitive by today's standards, but they were organized into societies and had navigation and architecture. The irony here being that civilization technically already ended once before, if that's correct. Without a citable reference, I'll leave this out of the article. No, Ancient Aliens doesn't count! 2601:1:9280:155:E179:39EF:33A1:CDAB (talk) 03:22, 17 April 2014 (UTC)

Splitting off existential risk

An editor has been attempting to fork this article, creating a new article, existential risk. This is a significant re-arrangement of ideas that needs consensus. I disagree with the change, as it's confusing and unnecessary. This article is already about existential risk. It's too difficult and confusing to have two articles on the same topic. If required, we can entertain renaming this article to existential risk, but we just had a major discussion on what to name this article (see above) and there was no consensus for naming it existential risk. -- GreenC 13:45, 13 June 2014 (UTC)