This is the talk page for discussing improvements to the Parallel computing article.
Parallel computing has been listed as a level-4 vital article in Technology. If you can improve it, please do. This article has been rated as FA-Class.
Parallel computing is a featured article; it (or a previous version of it) has been identified as one of the best articles produced by the Wikipedia community. Even so, if you can update or improve it, please do so.
This article appeared on Wikipedia's Main Page as Today's featured article on March 18, 2009.
Current status: Featured article
WikiProject Computing / CompSci (Rated FA-class, Top-importance)
WikiProject Spoken Wikipedia
Threads older than 60 days may be archived.
Should the paragraph about Application Checkpointing be in this article about parallel computing?
- Fault tolerance is a major (though often overlooked) part of parallel computing, and checkpointing is a major part of fault tolerance. So yes, it definitely belongs here. Raul654 (talk) 20:07, 8 March 2010 (UTC)
I came to the page to read the article and was also confused as to why checkpointing was there. It seems very out of place, and while fault tolerance may be important to parallelism, this isn't an article about fault tolerance mechanisms. It would be more logical to mention that parallelism has a strong need for fault tolerance and then link to other pages on the topic. 22.214.171.124 (talk) 01:27, 23 April 2011 (UTC)
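For editors unfamiliar with the mechanism under discussion: application checkpointing just means periodically persisting a job's state so that, after a failure, the job can resume from the last checkpoint instead of restarting from scratch. A minimal illustrative sketch in Python (my own toy example, not drawn from the article; the file path, function names, and the `interval` parameter are all hypothetical):

```python
import os
import pickle
import tempfile

# Hypothetical checkpoint location for this toy example.
CHECKPOINT = os.path.join(tempfile.gettempdir(), "ckpt_demo.pkl")

def save_checkpoint(state, path=CHECKPOINT):
    """Persist application state atomically so a restarted job can resume."""
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, path)  # atomic rename: never leaves a half-written checkpoint

def load_checkpoint(path=CHECKPOINT):
    """Return the last saved state, or None if no checkpoint exists."""
    try:
        with open(path, "rb") as f:
            return pickle.load(f)
    except FileNotFoundError:
        return None

def long_running_sum(n, interval=1000):
    """Sum 0..n-1, checkpointing every `interval` iterations.

    If the process dies mid-run, calling this again resumes from the
    last checkpoint rather than recomputing everything.
    """
    state = load_checkpoint() or {"i": 0, "total": 0}
    for i in range(state["i"], n):
        state["total"] += i
        state["i"] = i + 1
        if state["i"] % interval == 0:
            save_checkpoint(state)
    if os.path.exists(CHECKPOINT):
        os.remove(CHECKPOINT)  # job finished; discard the checkpoint
    return state["total"]
```

In a real parallel system the hard part is coordinating a *consistent* checkpoint across many processes, which is exactly why the technique is tied to fault tolerance in long-running parallel jobs.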
Attempting new article: Distributed operating system
I am green as a freshly minted Franklin, never posted before (so be nice)
Graduate student at the University of Illinois at Urbana-Champaign
Semester project; regardless, always wanted to do something like this...
All initial work should be (majority) my effort
As a word to the wise is sufficient; please advise, rather than take first-hand action.
The article should (and will) be of substantial size, but is currently no more than scaffolding
The "bullet-points" are intended to outline the potential discussion, and will NOT be in the finished product
The snippet of text under each reference is from the reference itself, to display applicability
Direct copying of reference information WILL NOT be part of any section of this article
Again, this information is here to give an idea of the paper, without having to go and read it...
Article sections that are drafted so far are quite "wordy".... (yawn...)
Most of the prose at this point is inflated about 1.5X - 2.0X over the anticipated final product
This is my writing style, which has a natural evolution, through iteration
Complex -> confused -> constrained -> coherent -> concise (now, if it only took 5 iterations???)
Again, thank you in advance for your patience and understanding
I look forward to working with you guys...
Project Location: Distributed operating system
Project Discussion: Talk: Distributed operating system
JLSjr (talk) 01:31, 20 March 2010 (UTC)
Spoken Wikipedia recording
What a pleasant surprise. A Wikipedia article on advanced computing that is actually in good shape. The article structure is (surprise) logical, and I see no major errors in it. But the sub-articles it points to are often low quality, e.g. Automatic parallelization, Application checkpointing, etc.
The hardware aspects are handled better here than the software issues, however. The Algorithmic methods section can do with a serious rework.
The template used here, called programming paradigms, is, however, in hopeless shape, and I will remove it, given that it is a sad spot on an otherwise nice article. History2007 (talk) 22:40, 8 February 2012 (UTC)
Babbage and parallelism
"The origins of true (MIMD) parallelism go back to Federico Luigi, Conte Menabrea and his "Sketch of the Analytic Engine Invented by Charles Babbage"."
Not that I can see. This single mention refers to a system that does not appear in any other work, did not appear in Babbage's designs, and appears to be nothing more than "it would be nice if..." Well of course it would be. Unless someone has a much better reference, one that suggests how this was to work, I remain highly skeptical that the passage is correct in any way. Babbage's design did have parallelism in the ALU (which is all it was) but that is not parallel computing in the modern sense of the term. Maury Markowitz (talk) 14:25, 25 February 2015 (UTC)
Dear Maury Markowitz,
Forgive me for reverting a recent edit you made to the parallel computing article.
You are right that Babbage's machine had a parallel ALU, but it did not have parallel instructions or operands and so does not meet the modern definition of the term "parallel computing".
However, at least one source says "The earliest reference to parallelism in computer design is thought to be in General L. F. Menabrea's publication ... It does not appear that this ability to perform parallel operation was included in the final design of Babbage's calculating engine" -- Hockney and Jesshope, p. 8. (Are they referring to the phrase "give several results at the same time" in (Augusta's translation of) Menabrea's article?)
So my understanding is that this source says the modern idea of parallel computing does go back at least to Menabrea's article, even though parallelism was only a brief tangent there; the article's main topic was a machine that does not meet the modern definition of parallel computing.
Perhaps that source is wrong. Can we find any sources that disagree? The first paragraph of the WP:VERIFY policy seems to encourage presenting what the various sources say, even when it is obvious that some of them are wrong. (Like many other aspects of Wikipedia, that aspect of "WP:VERIFY" strikes me as crazy at first, but then months later I start to think it's a good idea).
The main problem I have with that sentence is that it implies that only MIMD qualifies as "true parallelism". So if systolic arrays (MISD) and the machines from MasPar and Thinking Machines Corporation (SIMD) don't qualify as true parallel computing, but they are not sequential computing (SISD) either, then what are they? Is the "MIMD" part of the sentence supported by any sources? --DavidCary (talk) 07:04, 26 February 2015 (UTC)
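As an aside for readers following this thread, the SIMD/MIMD distinction from Flynn's taxonomy being argued over can be made concrete with a toy Python sketch (my own illustration, not from any cited source; the function names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def simd_double(data):
    """SIMD-style: ONE instruction stream (doubling) applied uniformly
    to MANY data elements, as on a MasPar or Connection Machine."""
    return [x * 2 for x in data]  # same operation across the whole vector

def mimd_run(tasks):
    """MIMD-style: MANY independent instruction streams, each operating
    on its own data stream, as on a modern multiprocessor."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(fn, arg) for fn, arg in tasks]  # different code per worker
        return [f.result() for f in futures]
```

Neither pattern is sequential (SISD), which is the point of the objection above: a definition that calls only MIMD "true parallelism" leaves the SIMD and MISD classes unclassified.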
- The "idea" may indeed date back to Menabrea's article, in the same way that flying to the Moon dates back to Lucian's 2nd-century story about a sun-moon war. I think we do the reader a major disservice if we suggest that Menabrea's musings were any more serious than Lucian's. Typically I handle these sorts of claims in this fashion...
- "Menabrea's article on Babbage's Analytical Engine contains a passage musing about the potential performance improvements that might be achieved if the machine was able to perform calculations on several numbers at the same time. This appears to be the first historical mention of the concept of computing parallelism, although Menabrea does not explain how it might be achieved, and Babbage's designs did not include any sort of functionality along these lines."
- That statement is factually true and clearly explains the nature of the passage. Frankly, I think this sort of trivia is precisely the sort of thing we should expunge from the Wiki (otherwise we'd have mentions of Tesla in every article), but if you think it's worthwhile to add, let's do so in a form that makes it clear. Maury Markowitz (talk) 14:44, 26 February 2015 (UTC)