Talk:Fifth generation computer
Not disputing the conclusions, but some of the language in this article doesn't seem exactly NPOV. --Chronodm 12:21, 13 March 2006 (UTC)
The POV is slightly less than neutral. There are subtle cultural problems, and a degree of revisionism inherent in what has been written here. I can tell you that workstations were an afterthought in the period between 1982-1984. Architectural decisions in Japan at the time slanted more toward mainframes. The article is missing 2 of the 3 basic key ideas Moto-Oka and others took away from the FGCS conference: 1) data flow (Jack Dennis' ideas), 2) Prolog (which the article has, though I am not clear whether Warren was at the meeting like Jack was), and, I think, 3) knowledge bases (Feigenbaum?). Those were starting ideas, but Moto-Oka's book should be examined for these. These were somewhat alarming at the time to USA-ians. The project should also not be confused with a similar Japanese superspeed project (Science article). I am not clear where I would begin editing without Moto-Oka's book. --enm ~21:00, 8 Aug 2006 (UTC)
The first paragraph suggests that the redirects and page title here are wrong:
- The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware).
This implies that this page should be called "Fifth Generation Computer Systems project" or something similar. It seems to indicate that "Fifth generation computer" should point to "history of computing hardware". In fact, the other "nth generation computer" articles already redirect to that article. Any thoughts? -- ShinmaWa(talk) 00:07, 21 August 2006 (UTC)
The "failure" chapter reads: the internet enabled locally stored databases to become distributed; even simple research projects provided better real-world results in data mining, Google being a good example
This is an anachronism. Google was founded sixteen years after the start of the FGCS project. Are there any contemporary examples of a simple research project outperforming the FGCS in its own field?
- Someone edited this to omit the reference to Google (good), but it still reads:
- During its lifespan... the internet enabled locally stored databases to become distributed
- If I'm not mistaken, this didn't happen before 1992 (the end of the 5th Gen Project) either. The ARPAnet existed at that time, and I guess you could say the early Internet, but afaik there were no databases distributed over the internet. (I'm also unclear why distributed databases would render the 5th Gen project antiquated even if they had existed back then, but maybe there's a connection I'm missing.) At the very least, this claim needs a citation. Mcswell (talk) 23:26, 29 November 2009 (UTC)
Was the ability of others to distribute databases relevant if the FGCS databases were in any case custom-made for the task and not something you stored sales records in? Further down it reads:
The workstations had no appeal in a market where single-CPU systems could outrun them, and the entire concept was overtaken by the Internet.
Massively parallel computing on workstations connected to the internet didn't have its breakthrough until the release of Seti@Home in 1999. Before that, supercomputers with a large number of CPUs working in parallel were the norm; an example contemporary with the FGCS project was the Cray-2 "supercomputer". In light of that, "overtaken by the internet" is an anachronism. In any case, the AI software and related concepts got scrapped as well; nobody, as far as I know, ported similar systems to individual computers hooked up in a network in the late 80s-early 90s. EverGreg (talk) 14:53, 31 March 2008 (UTC)
- The problem was that no authoritative source could say what constituted a 4th generation of computers. No consistent consensus (despite 4GLs: "4th generation languages") was ever reached, so the Japanese jumped over this for their "5th" generation. Other generational measures existed, such as Sid Fernbach's class system of supercomputers (he edited a book), but no one ever definitively specified a "Class 7" machine in Sid's system, and Sid would challenge anyone who did (to his death).
- The revisionist problem that I see with this web page now is the inclusion of the line on workstations. The Japanese began the project completely mainframe-oriented. Why not? Most of their machines were IBM clones. In fact, when you look at various competitive efforts outside the US (excepting parts of Europe such as the UK, France, and Germany), most of the efforts were IBM mainframe clones. It was only after some time that they realized some of the mistakes they had made. The Japanese, and now even the Chinese, aren't straying too far architecturally from IBM (the Chinese in the IBM Wintel sense). One can deduce aspects of this, as a check, if you are knowledgeable about computing in the 1980s, because of the consistent IBM compatibility. 22.214.171.124 (talk) 22:32, 17 June 2009 (UTC)
Non-reliable source
The reference "Carl Hewitt, Inconsistency Robustness in Logic Programming, ArXiv 2009" is not a refereed paper. The sentence where the reference is used is also dubious. The sentence reads:
- In particular, the committed choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages.
The problem is that concurrent constraint logic programming (CCLP) is based on an ask/tell paradigm rather than on the read/write variable paradigm of guarded Horn clauses (GHC). I also find references that give a logical reading to both:
- Frank S. de Boer, Catuscia Palamidessi: "From Concurrent Logic Programming to Concurrent Constraint Programming", 1993
- Thom Frühwirth, Slim Abdennadher: Essentials of Constraint Programming, Chapter 6, 2003.
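For readers unfamiliar with the term under dispute: "committed choice" means that once one clause's guard succeeds, the alternatives are pruned and never revisited, so the program can no longer be read as a logical disjunction that enumerates all solutions. A minimal sketch of that semantic point (this is my own toy illustration in Python, not FGCS or GHC code; the function names are invented):

```python
def q():
    # Stands in for one clause body: "q(1) holds".
    yield 1

def r():
    # Stands in for the alternative clause body: "r(2) holds".
    yield 2

def p_logical():
    # Logical (Prolog-style) reading: p(X) :- q(X).  p(X) :- r(X).
    # Both clauses are disjuncts; backtracking enumerates every solution.
    yield from q()
    yield from r()

def p_committed():
    # Committed-choice (GHC-style) reading: once the first clause's
    # guard succeeds, we commit to it and never try the alternative.
    for x in q():
        yield x
        return  # commit: r() is pruned
    for x in r():
        yield x
        return

assert list(p_logical()) == [1, 2]   # full logical semantics
assert list(p_committed()) == [1]    # the solution X = 2 is lost
```

Whether that loss of solutions counts as "interfering with the logical semantics" is exactly what the cited references dispute for the ask/tell formulation of CCLP.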
That page has vastly degenerated.
The list of references no longer even cites the original 1982 5th Gen report, in favor of a second-hand report by Shapiro (not to malign Ehud, who is innocent of this). Is this the result of a supposedly neutral point of view? It's clearly not written by Asians, or probably even by people who have been to Asia (Japan specifically). I expect to see cited Jack Dennis, who was responsible for convincing the Japanese to go with dataflow architectures, Ed Feigenbaum for expert systems (another dead end), and a third American whose name escapes me, on a third topic which also escapes me. This is practically a worthless Wikipedia page. I'm ashamed to have to cite it. 126.96.36.199 (talk) 21:38, 13 November 2014 (UTC)