Alan Kay didn't invent OO (that's the contribution of Ole-Johan Dahl and Kristen Nygaard, in the form of Simula). However, he took the set of concepts from Simula and developed them further into a paradigm (something I personally regard as rather dubious; both the "everything must be in a class" and "everything 'just responds to messages'" notions are doing more harm than good IMNSHO, but saying so in the article would stray from the neutral point of view *g*). (I'm putting this into Discussion because I'm not ready to word it properly in the article -- somebody else please step in.)
Since this is primarily an article about Alan Kay and not about OO or PARC or the Norwegian Computing Centre and the people there who worked on Simula, I tried putting the mention and the link to it in the most concise way possible. At the same time, I just had to add that Kay was not the only one at PARC who was responsible for this, otherwise the mention of the Norwegian Computing Centre alone would have given that impression. AlainV 20:32, 1 Jul 2004 (UTC)
Actually, Alan Kay did coin the term "object-oriented", and he now regrets it because people interpret it differently from what he had in mind. He fully credits Simula and earlier developers for the design and says that Smalltalk is based on ideas from Simula.
-- Cleo Nov 17, 2005
There are too many external links -- would someone care to take the time to weed through them and pick the best? JustinHall 21:36, 22 July 2005 (UTC)
A good start is:
- I just blithely ignored this injunction. Sorry. I added a link to his Turing Award lecture (2003), which has the same title as his OOPSLA 1997 and EDUCOM 1998 talks. Haven't watched them, so I don't know if there are any significant differences in content or quality. However, the Turing Award lecture is notable in itself. Barring any objections due to the considerations above, I think it's the one that should stay. --Clconway 07:46, 22 June 2006 (UTC)
Possible misinformation: Apple, Disney
In the same session that User:126.96.36.199 vandalized the entry for Applied Minds, someone using the same IP address made edits to this article. Here's the link to the edits. --Zippy 16:32, 2 March 2006 (UTC)
On the source of "The best way to predict the future is to invent it"
I have removed "In reality, Dr. Greg Ralbovsky said this at the 1994 AAAS conference on natural language syntax processing."
Alan Kay claimed that he said this, back in 1971, and the quote itself was published in "The Early History of Smalltalk", by the ACM: SIGPLAN notices, V28, N3, March 1993. The full quote is:
"The best way to predict the future is to invent it. Don't worry about what all those other people might do, this is the century in which almost any clear vision can be made"
- I believe the parenthetical note "(This is likely originally from Dennis Gabor who published Inventing the Future in 1963.)" is an interesting footnote, but it suggests a degree of plagiarism and detracts from the power of the Kay quote. "The best way to predict the future is to invent it" is a unique concept, especially when taken with other comments that Kay has made expanding on this theme.
- I believe that the Dennis Gabor note should be removed; Gabor's work should stand on its own merits. YORD-the-unknown 09:46, 27 August 2006 (UTC)
- Follow-up: In addition to the initial quote, Kay elaborated on his theme on several occasions. In 1984, in a paper published by M.I.T., he wrote: "The future is not laid out on a track. It is something that we can decide, and to the extent that we do not violate any known laws of the universe, we can probably make it work the way that we want to."
- Action: Unless someone offers a reasonable objection, I will remove the Dennis Gabor note from the Alan Kay quote. YORD-the-unknown 17:25, 29 August 2006 (UTC)
- Follow-up -- Hello 188.8.131.52, Several months ago I questioned your comment about Dennis Gabor in the User Talk section of the Alan Kay biography. See http://en.wikipedia.org/wiki/Talk:Alan_Kay, comment "On the source of 'The best way to predict the future is to invent it'". I believe my comments are still valid. Please be so kind as to explain your rationale as to why your parenthetical note provides any value to the collection of Alan Kay quotes. What specifically did Mr. Gabor say that you feel justifies your note? Generally speaking, quotations do not require footnotes unless you are accusing Dr. Kay of plagiarism. Dennis Gabor's body of work is substantial and important, and really stands on its own merits. Sincerely, YORD-the-unknown 00:35, 19 March 2007 (UTC)
- Perhaps this will help clarify the issue: http://www.cc.gatech.edu/fce/c2000/pubs/nab97/index.html refer to "Human-Centered Design" YORD-the-unknown 00:57, 19 March 2007 (UTC)
On March 19, 2007, User "184.108.40.206" reverted the Alan Kay quote to assert that the quote was perhaps derived from the work of Dennis Gabor. Following is a link to the discussion "Comments about Alan Kay" -- http://en.wikipedia.org/wiki/User_talk:220.127.116.11 YORD-the-unknown 15:04, 20 March 2007 (UTC)
A few comments on the article and the comments
From Alan Kay --
I would have written this bio a little differently, and certainly would have been a little more accurate.
For example, Ivan Sutherland did the Sketchpad work in 1962 and published his PhD thesis in 1963. I saw it in 1966 when I first started grad school at the U of Utah (and this was 3 years before Ivan came out to join up with Dave Evans).
(Also, this is when I quit being a professional musician because I got so busy with ARPA research.)
So I didn't work with Ivan on Sketchpad, but it was the largest inspiration for my object ideas. Simula was next (partly because I was "forced" to look at Simula I in the same week as reading Ivan's thesis). The similarity between these, the roots of Biology and Algebra, and the talked-about ARPAnet-to-be catalysed a particular view of computation as made up completely of independent "computers" communicating by messages.
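The "independent computers communicating by messages" view described above can be sketched in a few lines. This is my own toy illustration in Python, not Kay's or Smalltalk's code; the `Messager` class and the `msg_` naming convention are invented for this example. The point is that an object's only entry point is a generic `send`, so its internals are reachable solely through messages:

```python
# Toy illustration: objects as little "computers" that interact only
# by receiving messages through a single send() entry point.

class Messager:
    """Base class: every interaction goes through send(selector, ...)."""
    def send(self, selector, *args):
        handler = getattr(self, "msg_" + selector, None)
        if handler is None:
            return self.msg_doesNotUnderstand(selector, args)
        return handler(*args)

    def msg_doesNotUnderstand(self, selector, args):
        # Smalltalk-style fallback for unrecognized messages.
        raise AttributeError(f"does not understand {selector!r}")

class Counter(Messager):
    def __init__(self):
        self.count = 0
    def msg_increment(self):
        self.count += 1
        return self.count
    def msg_value(self):
        return self.count

c = Counter()
c.send("increment")
c.send("increment")
print(c.send("value"))  # -> 2
```

The design choice being illustrated: callers never touch `c.count` directly, so the receiving object stays free to handle, forward, or reject any message it gets.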
Much of this early history was chronicled for ACM's 2nd History of Programming Languages gathering in 1993 and is published in a book of the same name in 1996. This history has been well vetted by colleagues and is as accurate as a short history can be.
The term "object" in the early sixties was used for compound data structures. Doug Ross at MIT had written an early influential paper about embedding pointers to procedures in such data structures (and this was referenced by Ivan in his thesis). The original Simula insight was that Algol blocks should be independent entities (and this automatically created a structure that had properties, procedures, and a main routine that could be coroutined with other such entities).
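The Simula insight described here -- a block instance owning its own local data and a suspendable main routine that can be coroutined with other such instances -- can be loosely mimicked with Python generators. This is a rough analogy of my own, not actual Simula code; `producer` and `consumer` are invented names:

```python
# Rough analogy: each "block instance" keeps local state in its own
# frame plus a main routine that can be suspended (yield) and resumed,
# interleaved with other instances, Simula's quasi-parallel style.

def producer(buffer):
    for i in range(3):
        buffer.append(i)   # local loop state lives in this frame
        yield              # detach: let another block instance run

def consumer(buffer, seen):
    while True:
        if buffer:
            seen.append(buffer.pop(0))
        yield

buffer, seen = [], []
prod, cons = producer(buffer), consumer(buffer, seen)
# Interleave the two "main routines" by hand, three rounds each.
for _ in range(3):
    next(prod)
    next(cons)
print(seen)  # -> [0, 1, 2]
```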
I loved both these systems, but I loved Sketchpad more, because it had other quasi-biological and particle-and-field properties that I thought would aid scaling. I regret saying "object-oriented programming" when someone asked me what I was doing, because it presents an object as too static and only a responder (whereas Sketchpad, Simula, biological cells, and computers are all independently active).
In any case, "OOP" got turned into a "paint" after the success of PARC. The creator of C++ said he wasn't going to go as far as Smalltalk, but he thought the C community could benefit from the same kind of preprocessor that Simula used to transform Algol -- C++ is very Simula-like -- and it was called object-oriented (as have been many of its successors). This forced us to call Smalltalk and CLOS "dynamic object-oriented languages", and most of the programming community today has no idea what this means.
However, Smalltalk was only really interesting in the 70s -- it represented a real jump in expressibility and leverage.
But today, it matters not that Smalltalk was an "improvement on its successors" (as Tony Hoare said about Algol). None of the so-called OOP languages around today are above threshold to deal with programming in the 21st century. I think this is a huge problem, that is made more severe by the vocational temptations to "get good at something bad" in order to make a living. This has produced a staggering legacy of moribund code, that makes it hard for young people especially to think about qualitatively better ways to proceed.
There are a few other errors in the article but none serious. But this brings up another question. The wikipedia is a wonderful creation, but so many of the articles are essentially opinions, sometimes using secondary sources. In computing, most of the folks who did things in the sixties and seventies are still alive, so why not just ask them to comment when their bio is entered as an article?
- Just because Alan Kay can be levelheaded and reasonable in commenting about his wiki entry does not mean that all people can. Yet to solicit such input would obligate wiki editors to give [possibly undue] weight to such comments. The effort to make the Wiki encyclopedically neutral is the primary reason to depend on "reliable sources" rather than original research; soliciting feedback from the subject of an article would definitely be OR. Bustter (talk) 08:15, 14 February 2013 (UTC)
Some unidentified poster made the following comment about an Alan Kay quote: "There does not seem to be any authoritative source that he actually said this, however." The poster apparently does not understand what the Smalltalk.org organization is or its relationship to Alan Kay. Unless someone can provide an actual refutation of the Kay quote, I'll strike this poster's comment within one week. The quote in question: "I invented the term Object-Oriented, and I can tell you I did not have C++ in mind." YORD-the-unknown 21:52, 28 January 2007 (UTC)
Alan Kay had a quote. The full quote said "Don't worry about what anybody else is going to do...The best way to predict the future is to invent it. Really smart people with reasonable funding can do just about anything that doesn't violate too many of Newton's Laws!" —Preceding unsigned comment added by 18.104.22.168 (talk) 01:59, 23 April 2009 (UTC)
- The source of this quote is from his talk: The Computer Revolution Hasn't Happened Yet at OOPSLA 1997. I transcribed the talk from the recording made available at Google Video. Moryton (talk) 05:44, 5 January 2008 (UTC)
There is a citation, but the source is missing. I'd vote to remove it until the source of the citation can be found and attached (just store it here).
I'm not sure whether this can be regarded as a valid source: http://www.openp2p.com/pub/a/p2p/2003/04/03/alan_kay.html Quote: Kay admires the great set of ideas present in LISP and refers to it as the "greatest single programming language ever designed."
- The quote is something like this: "Certainly the greatest single language, along with Simula of the sixties, I think one with as many profound or more profound insights: Lisp". From the video The Computer "Revolution" Hasn't Happened Yet! (around 48:15) —Preceding unsigned comment added by 22.214.171.124 (talk) 01:46, 12 September 2007 (UTC)
I don't understand why the Tweak paragraph is here. It isn't directly relevant to Alan Kay.
- Actually, it is, but the paragraph left the relationship unclear. I have clarified the relationship of Kay and Raab, per your comment. Thank you. Jerryobject 03:35, 2 September 2007 (UTC)