Talk:Technological singularity

Former good article: Technological singularity was one of the Social sciences and society good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.
Article milestones
Date | Process | Result
August 7, 2005 | Peer review | Reviewed
July 19, 2007 | Good article nominee | Listed
July 7, 2008 | Good article reassessment | Delisted
Current status: Delisted good article

First sentence

I have some concerns that, every few weeks or months when I look back on the first sentence of this article's lead, it has often been largely or entirely rewritten (including some adjustments by myself). No rewritings or edits to the sentence ever seem strictly wrong or in bad faith, yet such constant variation unsettles the integrity of the rest of the page. Is there any way we can form some kind of group consensus on a basic first sentence that can withstand the test of time and the huge degree of editing/variation this page receives? Do other people also feel this is an important task? Wolfdog (talk) 21:59, 14 July 2016 (UTC)

I concur it's a problem - per the above section, I think it's a symptom of the whole article being not very good and not up to Wikipedia standard. There has been past resistance to intro changes that don't sufficiently push the same line as the article body, so I'd say get the article body up to sourcing scratch first and see where we can get from there - David Gerard (talk) 22:11, 14 July 2016 (UTC)
What's wrong with separating it in two sentences? --Edoe (talk) 00:01, 17 July 2016 (UTC)
@Edoe: Nothing, I guess (though that's not my preference). But that's not really my issue anyway. @David Gerard: (and others:) Before July 14 (2016), there were three sources linked to the first sentence. David, is there any reason why you did away with them? I found two of them for us to review on Google Books: 1 and 2. Reference [1] links the singularity to "the most powerful 21st century technologies [that] are threatening to make humans an endangered species" and directly says it is "an event or phase [huh?] that will radically change human civilization, and perhaps even human nature itself, before the middle of the 21st century", and this sentence itself has four sources cited. The part of [2] that I read simply says the idea of the singularity is a tangled mess and prefers to discuss the subset of that idea that "interests us here": an "intelligence explosion" or "the prospect of machine superintelligence," presuming these descriptions are sufficient for the reader to understand without more detail. Admittedly, though, the viewable portion of the Google Book ends shortly after that. Are any of these sources, or others, usable? Wolfdog (talk) 14:40, 17 July 2016 (UTC)
The edit was admittedly a bit of a quick hack on my part, so please do feel free to put back anything particularly relevant. The Joy article is in the pile of non-refs linked below for your cut'n'paste convenience. That second link is Bostrom, who is a famous opinion on the subject (even though I think the actual book is redigested glibness) so may be quotable on that score - and does clearly credit Vinge with the idea's popularisation - David Gerard (talk) 16:06, 17 July 2016 (UTC)
I'm thinking we start off the first sentence of the article as simply as possible, yet with a source or two, to provide stability to the sentence. There's plenty of room for nuances and complexity in the rest of the article. Here's my thought:
  • The technological singularity is the hypothetical emergence of an artificial superintelligence [before the middle of the 21st century?] that will radically change human civilization or even human nature.
This is based on what David Gerard is calling the Joy source. Thoughts? Wolfdog (talk) 13:45, 18 July 2016 (UTC)
"Radically change" is unclear in the context of a first sentence. Vinge 1993 characterizes it as the end of the "human era", which is more descriptive. Changing "human nature" doesn't seem an *intrinsic* part of the Vingean concept. The three key elements of the Vingean concept seem to be that the change is more profound than any seen before in history ("the human era will be ended"), that it is triggered by superhuman intelligence ("the cause of this change is the imminent creation by technology of entities with greater than human intelligence"), and that the change will be abrupt ("...this change will be a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control.") (Vinge 1993). So that's how I'd like to scope the article; anyone can feel free to propose a different scope. If people like the scope but not the lede, here are some other proposals with the same scope (recall that there's no requirement that the title appear in the first sentence):
  • The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to civilization and signalling the end of the "human era".
  • In 1993, science fiction author Vernor Vinge famously predicted that any invention of a superhuman intelligence would abruptly trigger runaway technological growth, resulting in unfathomable changes to civilization and signalling the end of the "human era". Scholars actively debate whether such a technological singularity is plausible; whether it is likely to occur in the 21st century; and what the aftermath of this sudden state of extreme technological advancement would be: predictions run the gamut from Vinge's fears of human extinction or enslavement, to transhumanist Ray Kurzweil's utopian vision of immortal spacefaring citizens whose every material need can be instantly satisfied. Rolf H Nelson (talk) 04:12, 20 July 2016 (UTC)
@Rolf h nelson: I really like your first bullet there. It uses simple language that anyone can get (though the term "human era" seems a little cloudy, which you seem to note yourself by putting it in quote marks). Since this concept is often popularly discussed in sci-fi and tech circles, I feel we have a duty to keep the first sentence a short-and-sweet encapsulation for our lay reader. The rest of the article is the appropriate place for elements from your second bullet, which goes into all kinds of important but more in-depth information. Unless there are any other issues, I think we should go for it! Wolfdog (talk) 14:46, 20 July 2016 (UTC)
I like that first bullet point as well, though I'm a bit more leery than Wolfdog about the "human era" bit. The idea (as far as I understand it) is that the production of smarter-than-human AIs would permit the production of smarter-than-those-AIs AIs, which would in turn permit the production of still smarter AIs. That seems as likely to result in a Post-scarcity economy (because those ultra-smart AIs can design things other than future AIs, of course) as it is to signal the imminent extinction or irrelevancy of humanity. Hell, it might result in something similar to the Dune universe, or possibly even a religious mythology by way of events similar to The Last Question. The problem here is that the effects of a singularity are, by definition, unpredictable. Hence, we should not be speculating on what they might be. I would suggest the following:
  • The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, and result in a series of unpredictable changes to civilization. MjolnirPants Tell me all about it. 15:21, 23 July 2016 (UTC)
"The problem here is that the effects of a singularity are, by definition, unpredictable. Hence, we should not be speculating on what they might be." I agree with the sources I've seen that it's worthwhile trying to analyze what the effects of a singularity would be; is there a source that says otherwise? In particular, I'm not seeing where the "it's so unpredictable that we can't even predict that it'll be unpredictable, so maybe it'll be the same as now" meme comes from; the closest I've found is some sources claiming that it's unfathomable, in the sense that society will be so changed that we can't even comprehend what will happen next. Rolf H Nelson (talk) 21:06, 24 July 2016 (UTC)

"...result in a series of unpredictable changes to civilization". Sure, let's talk about that in terms of what we want the scope of the article to be. A key element of the Vingean hypothesis is what Vinge calls "change comparable to the rise of human life on Earth". "Unpredictable" in the lede doesn't really capture that; lots of things in the future are unpredictable. In terms of scope, you can break out two hypotheses, one is that it's post-scarcity but otherwise probably looks normal, kind of like the industrial revolution. The other hypothesis, which seems to be in more common use, is that the change is much more dramatic. Rolf H Nelson (talk) 21:06, 24 July 2016 (UTC)

"I agree with the sources I've seen that it's worthwhile trying to analyze what the effects of a singularity would be." That would be Original Research and it's not permitted here. I agree that speculating about the effects is worthwhile. Just not here.
"it's so unpredictable that we can't even predict that it'll be unpredictable, so maybe it'll be the same as now" Nobody has suggested that. In fact, I suggested quite the opposite. Please read my comments if you plan on responding to them, else I'm likely to start simply ignoring you.
"the closest I've found is some sources claiming that it's unfathomable, in the sense that society will be so changed that we can't even comprehend what will happen next." It is axiomatic that one cannot predict the behavior of a being much more intelligent than oneself, absent some well-understood situational constraints on it. It doesn't matter how many or how few people have pointed this out; it is true in such a way that it could never be untrue. It's debatable whether the statement would even require a source were it to appear in article space.
"...result in a series of unpredictable changes to civilization". Sure, let's talk about that in terms of what we want the scope of the article to be. No. This article will not speculate on what the effects will be. It may report the speculation of notable, prominent figures, but it will clearly attribute those speculations to them. MjolnirPants Tell me all about it. 17:44, 26 July 2016 (UTC)
You do actually see this pop up occasionally on the Internet, for example people saying "the singularity has already happened".
I guess we'll have to agree to differ, then. The AI community, like me, agrees that I can successfully predict that Deep Blue will probably beat me in chess. Rolf H Nelson (talk) 03:08, 27 July 2016 (UTC)
First off, do not edit other users' comments. It is extremely rude.
I haven't been editing anyone's comments. Rolf H Nelson (talk) 03:51, 29 July 2016 (UTC)
Second, Deep Blue has nowhere near the overall processing power of even the most developmentally challenged person capable of using a computer. A hundred Deep Blues wouldn't have the processing power of a single human brain. You may think of it as some archetype of a powerful computer, but my laptop can churn through more FLOPS than Deep Blue. A LOT more if you count my GPU, too.
Finally, beating you is an outcome, or a result. It's not a behavior. Even if one stretches the definition of 'behavior' to include the relative success or failure of actual behaviors, you're still talking about the rules of chess, or the exact caveat I gave: "...well-understood situational constraints..." MjolnirPants Tell me all about it. 05:29, 27 July 2016 (UTC)
  • After having read through the options again, and giving it some consideration, I feel like the current first sentence is better than any of the options proposed here, including my own. MjolnirPants Tell me all about it. 13:43, 27 July 2016 (UTC)
To be honest, I disagree and still feel a straightforward, simple first sentence is preferable to what we have now. All three of us already agreed on some variation of "The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable[ or unpredictable] changes to civilization and signalling the end of the 'human era'." Rolf H Nelson prefers "unfathomable" over "unpredictable", which seems logical to me. If the "unpredictable/unfathomable changes" idea still seems too vague, we could merge the aforementioned proposed sentence with the current one and attain something like "The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in a superintelligence that will, qualitatively, far surpass all human intelligence." Wolfdog (talk) 13:11, 30 July 2016 (UTC)
I'm okay with your proposal there. MjolnirPants Tell me all about it. 14:17, 1 August 2016 (UTC)
Sorry, I just realized my proposed sentence is redundant. I use the term "intelligence" three times! Probably the best merged sentence from what everyone has written here is: "The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to civilization." Wolfdog (talk) 13:11, 30 July 2016 (UTC)

Date inconsistencies?

John von Neumann died in 1957, but some sections of this article credit his work to 1958. — Preceding unsigned comment added by 175.156.67.226 (talk) 11:45, 25 September 2016 (UTC)

The publication was posthumous. Rolf H Nelson (talk) 05:12, 26 September 2016 (UTC)

a "big" guestion...[edit]

(Excuse me if I make some mistakes in my English!) I don't have a lot to say, and I will be very short; I have only one question. All of you who develop such fast, such complicated technology for human life say, or think, that all that tech is useful for humankind, OK? I think all that fast-forward growth of civilization is "normal", because I know it is in our blood to progress and to discover everything in our lives. BUT (and keep in mind that what follows is MY question and my answer): have you ever thought just to ASK all of humankind whether we want all that tech so deep inside our lives? I don't mean that it is "bad" (good for the world economy, maybe), but is it "good" for humans? My personal thinking is that if we go very deep into technology for the human mind and body, people's normal "program" is going to fail! So, as a citizen of Earth, I beg you all (with great respect to all scientists) to take it slow, soft, and easy, because we are such complicated "machines", and I think the human body and mind still hold unsolved secrets! Thanks a lot for reading me, and don't worry: I am from Greece and I believe in progress and new tech, but please take it easy! That's all... Ευχαριστω (thank you). — Preceding unsigned comment added by 2A02:2149:8233:EA00:19B:5073:7217:5858 (talk) 21:23, 18 February 2017 (UTC)

Merger proposal

I propose that Intelligence explosion be merged into Technological singularity. They're really quite similar and the sources often don't make any distinction between the two. K.Bog 04:13, 28 March 2017 (UTC)

Oppose – The topics are closely related, but distinct. The technological singularity is a point in time. It is the instant in which the very first computer or robot that has human-level or better intelligence comes into existence. Its article is analogous to the article gravitational singularity, a phenomenon which is immediately followed by the Big Bang. Those two phenomena are very closely related. Analogously, it is believed that the technological singularity will be followed immediately by an intelligence explosion (the progression towards superintelligence). They are also very closely related. And just like the singularity and the Big Bang, technological singularity and intelligence explosion are distinct concepts. It is important to make the distinction between the beginning of AI and what comes after. Like covering the stages in the evolution of the universe, or the stages in the evolution of a star, all of which have an article on them. They each have notability as a distinct topic, kind of like embryo and human body (one turns into the other, but they are worthy of their own articles). The Transhumanist 20:14, 28 March 2017 (UTC)
There is no question that technically they are different ideas. But the first relevant question is whether sources actually make a distinction between the two. They generally do not, and therefore it would be extremely hard to write an article without giving personal interpretations and doing original research. The second relevant question is whether you can have separate articles without the content mostly being duplicate. Look at the article for intelligence explosion right now: it talks about speed and intelligence of AI, superintelligence, existential risk, and the rate of improvement. All these topics are covered in detail in the technological singularity article.
A gravitational singularity happened to occur at the time of the Big Bang, but there is plenty of potential for a gravitational singularity to exist elsewhere, like in a black hole. On the other hand, I can't fathom how an intelligence explosion could exist without a technological singularity. The intelligence explosion and the singularity happen concurrently, not one followed by the other. K.Bog 21:12, 28 March 2017 (UTC)
  • Support for pretty much the exact same rationale the OP gives, initially and in response to the oppose !vote. Particularly: "But the first relevant question is whether sources actually make a distinction between the two. They generally do not, and therefore it would be extremely hard to write an article without giving personal interpretations and doing original research." and "I can't fathom how an intelligence explosion could exist without a technological singularity." Those quotes represent my views exactly. ᛗᛁᛟᛚᚾᛁᚱPants Tell me all about it. 00:19, 29 March 2017 (UTC)

Recommendation for pop culture section

I think a good pop culture reference to add to this page is the anime "Ghost in the Shell". In the story, a fusion of a cyborg and an AI creates a super-intelligent AI. — Preceding unsigned comment added by Jamezism (talkcontribs) 05:07, 10 July 2017 (UTC)