Talk:Claude Shannon

This article is of interest to the following WikiProjects:
  • WikiProject Biography / Science and Academia (rated B-class; Mid-importance within the science and academia work group)
  • WikiProject United States (rated B-class, Low-importance)
  • WikiProject Cryptography / Computer science (rated B-class, Top-importance; also supported by WikiProject Computer science at Top-importance)
  • WikiProject Telecommunications (not yet rated for quality or importance)
  • WikiProject Systems (rated B-class, High-importance; within the field of Systems theory)
  • WikiProject Computer science (rated B-class, High-importance)
  • WikiProject Cycling (rated B-class)

Shannon's theorem vs. the Shannon-Hartley law

As far as I know, these two are different things. The latter law applies only to a fixed-bandwidth channel with continuous input and output alphabets, while the former is more general (for example, it covers discrete-alphabet channels, where the notion of bandwidth itself is meaningless). If there are no objections, I will add a Shannon's theorem page and fix the inconsistencies on this page. Julius.kusuma 17:24, 25 Oct 2004 (UTC)

"where the notion of bandwidth itself is meaningless." Not true. Here is a simple example: binary phase shift keying (BPSK). In BPSK, one bit of information is sent at a time, with each bit lasting T seconds. (Hence the bit rate, BR = 1/T). Let "A" be some positive voltage. Then to send a logic "1", an A is sent, and to send a logic "0", a -A is sent. These are modulated onto a carrier cosine(ωt), which sets the center frequency of the signal.
If the 1's and 0's are sent in a random-looking pattern (as is true in the practical case that maximizes the information sent), two different kinds of bandwidth can be defined from the power spectrum. A properly chosen bandwidth of 2×BR contains 95 percent of the power; a properly chosen bandwidth of BR contains 90 percent of the power. In either case, the rest is disposed of by filtering at the transmitter.
The latter can be called the "bit rate bandwidth", and in practical communications engineering, this is the one that is chosen. To give you a simple example, if the bit rate is 100,000 bits per second, then the bandwidth is 100,000 hertz, or in other words, 100 kilohertz. This is a real bandwidth that can be seen and measured in the electronics laboratory, and not the kind of fictitious "bandwidth" that computer scientists use. (They do not know anything about filtering, power spectra, and signal processing anyway, hence they misuse the word.)
In summary, the concept of bandwidth is a very meaningful one in discrete-signal channels, and we can work out the bandwidths of such discrete-signaling systems as BPSK, QPSK, 8-PSK, FSK, MFSK, 16-QAM, 64-QAM and dozens more - just to hit you with some electronics engineering jargon that I am not going to explain here. Such things you can look up on the Internet if "inquiring minds want to know." Try Phase-shift keying on Wikipedia. I will mention that the bandwidth of QPSK is 1/2 that of BPSK, and this is why QPSK is so common. (A numerical sketch of these power fractions follows below.)
98.67.107.241 (talk) 12:35, 2 September 2012 (UTC)
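A quick way to sanity-check the power percentages quoted above is to integrate the BPSK power spectrum numerically. The following is a minimal sketch, assuming ideal rectangular (unfiltered) pulses, for which the baseband spectrum is proportional to sinc²(fT); the exact percentages depend on the pulse shaping assumed, so they need not match the 90/95 percent figures quoted above.

    import numpy as np

    T = 1.0                                        # bit duration in seconds; bit rate BR = 1/T
    f = np.linspace(-200 / T, 200 / T, 2_000_001)  # frequency offset from the carrier
    psd = np.sinc(f * T) ** 2                      # np.sinc(x) = sin(pi*x)/(pi*x); rectangular-pulse BPSK
    total = np.trapz(psd, f)                       # total power (up to a scale factor)

    def power_fraction(bandwidth):
        """Fraction of total power inside a band of the given width, centred on the carrier."""
        inside = np.abs(f) <= bandwidth / 2.0
        return np.trapz(psd[inside], f[inside]) / total

    for multiple in (1, 2):                        # bandwidths of BR and 2 x BR
        print(f"B = {multiple} x BR contains {power_fraction(multiple / T):.1%} of the power")

For unfiltered pulses this prints roughly 77% within BR and 90% within 2×BR, so the 90/95 figures above evidently presuppose some transmit filtering.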

The relationship between Shannon's proof of the isomorphism between Boolean algebra and relay circuits

Regarding the NPOV link which I inserted in the main article:
Some silly idiot reverted my edits in which I made clear that Shannon was working independently of the early electronic computer pioneers. That is, they grasped the idea that relay circuits could be used to solve mathematical problems, but put them together in a rather ad hoc fashion, while Shannon went at it from the opposite direction---he grasped that Boolean algebra could be used to simplify relay circuits, and then (in the same paper) reached the idea that relay circuits could be used to solve Boolean algebra problems, but never got around to actually exploring the idea in hardware. I researched and wrote an eleven-page paper on Shannon's 1938 paper for an undergraduate history class, so I actually know what I am talking about. (A toy sketch of the isomorphism appears below, after this comment.)
There were three major pioneers: Atanasoff, Zuse, and Stibitz; while Aiken is disputed because his contemporaneous machines were electromechanical (a halfway step from Bush's differential analyzer), not fully electronic. Atanasoff and Zuse are on the record as explicitly admitting that they did not know of Shannon's work at the time they built their machines. See John V. Atanasoff, "Advent of Electronic Digital Computing," Annals of the History of Computing 6, no. 3 (July 1984): 241, and Konrad Zuse, "The Outline of a Computer Development from Mechanics to Electronics," in The Origins of Digital Computers: Selected Papers, 3rd edition, ed. Brian Randell (Berlin: Springer-Verlag, 1982), 178.
Although I couldn't get primary sources for Stibitz (he was on the East Coast and I am on the West Coast), there are two secondary sources that state that Stibitz did not know of Shannon's work when he was putting together his first machine. They are: James R. Beniger, The Control Revolution: Technological and Economic Origins of the Information Society (Cambridge: Harvard University Press, 1986), 406, and G. Harry Stine, The Untold Story of the Computer Revolution: Bits, Bytes, Bauds, and Brains (New York: Arbor House, 1985), 93.
If that idiot (who was too cowardly to login or to create an account) doesn't respond within a few days with cites to refute my cites, I'm going to put my edits back in!
Coolcaesar 10:58, 3 Jan 2005 (UTC)
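As an aside for readers unfamiliar with the isomorphism under discussion, here is a toy sketch (my own encoding, not anything from Shannon's paper) of the modern "transmission" reading of it: a series connection of contacts computes AND, a parallel connection computes OR, and Boolean identities such as the absorption law translate directly into contact-saving circuit simplifications. Shannon's 1938 paper actually worked in the dual "hindrance" convention, where series is + and parallel is ·.

    from itertools import product

    # A contact is modelled as a function from relay states to True (closed) / False (open).
    def contact(name):
        return lambda state: state[name]

    def series(*branches):     # series connection transmits only if every branch does: AND
        return lambda state: all(b(state) for b in branches)

    def parallel(*branches):   # parallel connection transmits if any branch does: OR
        return lambda state: any(b(state) for b in branches)

    x, y = contact("x"), contact("y")

    # Absorption law x(x + y) = x: this two-contact network is equivalent to the
    # single contact x, so the extra contacts can be removed from the circuit.
    network = series(x, parallel(x, y))
    for values in product([False, True], repeat=2):
        state = dict(zip("xy", values))
        assert network(state) == state["x"]
    print("x in series with (x parallel y) always equals x")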

Thanks for going to the effort of posting the sources. One thing, though. I'd encourage you to avoid making personal attacks ("Silly idiot", "cowardly", and so forth). See Wikipedia:No personal attacks and Wikipedia:Civility. — Matt Crypto 11:45, 3 Jan 2005 (UTC)
Give us a break about "no personal attacks" and all of that. There are some people who make changes to Wikipedia articles that are simply "beyond the pale". When they are speaking as knuckleheads, they need to be addressed as knuckleheads. There is no way around that. Also, they wish to argue with people who have bachelor's degrees and graduate degrees in the subjects at hand -- but then we find that those argumentative types often haven't even graduated from high school yet. Some of them never will.
98.67.107.241 (talk) 13:24, 2 September 2012 (UTC)
Of course they (Zuse, Aiken, etc.) are "on the record" as claiming they didn't know about Shannon's work, which was widely known and published in the Transactions [of the] American Institute of Electrical Engineers of 1938. It's lucky for the others that Shannon was such a nice (and brilliant) guy that he was generous with his ideas and didn't battle for credits.
I don't think that you realize that the journals of the AIEE and the IRE during the 1930s and earlier were not nearly as widely circulated and widely read as they became after World War II -- and continuing into the time of the IEEE. That was a big difference between then and now. Furthermore, people didn't go to the technical libraries of dozens of campuses of the Univ. of California, of Texas, of Wisconsin, of North Carolina, of Penn State, of SUNY, and several more universities in Florida to read journals, because those schools did not exist yet! You would have had a very hard time reading technical journals at the University of Texas at Dallas or the University of California at San Diego, for example. It was very easy for something to be published in an American journal and not be much noticed, especially in foreign countries.
98.67.107.241 (talk) 13:24, 2 September 2012 (UTC)
Isn't Zuse an interesting guy: one of his patents was TWICE rejected by the German government, both before and after World War II. He alleged that the Z1 and Z2 were destroyed during an Allied air raid, so history has no physical trace of them. He would later claim to have invented the Plankalkül, the "first" computer language, during WW II -- when in fact, it was not published until the 1970s, and it was never actually implemented in his lifetime!
You cite, as evidence, published literature by Atanasoff (about Atanasoff), and literature by Zuse (about Zuse). Your history professor should have warned you:
"...all fiction may be autobiography, but all autobiography is of course fiction." -- Shirley Abbott
"Autobiography is probably the most respectable form of lying." -- Humphrey Carpenter
"All autobiography is self-indulgent." -- Daphne Du Maurier
"An autobiography is only to be trusted when it reveals something disgraceful. A man who gives a good account of himself is probably lying, since any life when viewed from the inside is simply a series of defeats." -- George Orwell (a man too "cowardly", in your eyes, to publish under his own name of Eric Blair)
"We are always more anxious to be distinguished for a talent which we do not possess, than to be praised for the fifteen which we do possess." -- Mark Twain (a man too "cowardly", in your eyes, to publish under his own name of Samuel Clemens)
"My grandfather once told me that there were two kinds of people: those who do the work and those who take the credit. He told me to try to be in the first group; there was much less competition. " -- Indira Gandhi (whose surname has an interesting history as well, evidence of cowardism in your eye?)
If your history professor failed to critique you about the pitfalls of relying on autobiographies, demand a refund and transfer out of that college!
Well, I already graduated from that college with a bachelor's degree in history, and that college happens to be the most prestigious public university in the United States. Also, you conflate my dislike of anonymity with a dislike of pseudonyms. How childish. I have nothing against pseudonyms. As you should know, some of the greatest documents in history, like the Federalist Papers, were published under pseudonyms. Also, the convoluted history of the Gandhi family is more about political opportunism, which again is distinguishable.
The point is, I respect those who choose to adopt new names by choice, or through marriage, for either political or personal reasons, but I have a strong dislike of persons who are so cowardly as to not even dignify their writing with an identifier.
Now, turning to the heart of the issue, as a casual student of philosophy, I am painfully aware of the epistemological risks that arise from relying on any source, especially one where the author was probably aware that he was writing for an audience of historians, and thus there is the risk that his writing was self-serving.
However, you fail to answer the obvious question: motive. Why would Zuse or Atanasoff (or even Stibitz) do such a thing? Certainly, you are correct that Zuse has some credibility problems, although his work after the war does establish him to be a computer scientist of decent ability.
Still, Atanasoff was well-known during his lifetime for being ferociously litigious about who should be credited as the inventor of the electronic computer, yet in that 1984 paper, he conceded his weakness in Boolean algebra: "I did not recognize its application to my undertaking, and I obtained my result by trial, at first, and then by a kind of cognition."
If he was as self-serving as you seem to believe, then why would he admit his ignorance of Shannon's work when he could so easily have claimed to have reached the same result on his own?
The other problem I have with your approach to knowledge is that you seem to infer that just because Shannon's paper was published in the Transactions of the AIEE, it instantly became well-known to the entire electrical engineering community. Apparently you've never worked in electrical engineering, or computer science, or any other technical field. There are literally hundreds of new articles published every month, and it is impossible for every scientist and engineer to stay informed about all of them.
If you read Vannevar Bush's essay, "As We May Think," or H.G. Wells' book, "World Brain," you will immediately realize how daunting the task seemed to the intellectuals of the 1930s, long before computerized article databases became available. Both the memex and the World Brain were proposed solutions to the problem of information overload (although the phrase itself would not be coined until 1970, by Alvin Toffler). And before you point out that "As We May Think" was published in 1945, keep in mind that the problem was so evident by the early 1930s that Vannevar Bush was already fantasizing about a proto-memex at the time. See Bush, Vannevar, "The Inscrutable Thirties," Technology Review 35, no. 4 (January 1933): 123-148.
I will concede that one can safely infer that most of the electrical engineering community probably was aware of Shannon's paper by 1940, the year he won the Alfred Noble Prize (??), but the year that counts is 1938, the year the paper was published and the same year that Stibitz and Atanasoff had already begun building their first devices. Prior to 1940, Shannon was not yet an intellectual celebrity, and it is unreasonable to draw the inference that other intellectuals were desperate to immediately read, absorb, and marvel at anything with his name on it.
Also, in your biased attempt to write a Whiggish hagiography of Shannon (and if you don't understand why "Whiggish" is pejorative, then you are way out of your depth when it comes to history), you fail to realize that Shannon was not the first to come up with the isomorphism between relay circuits and Boolean algebra. As Alice Mary Hilton puts it: "The analogy between switching circuits and the propositional calculus occurred to scientists in all parts of the world. It was suggested by P. Ehrenfest in his review of Couturat's Algebra Logiki in 1910. The Russian physicist V. I. Shestakov worked out details in 1934-35, but did not publish them until 1941. Independently, the same views were published by Akira Nakasima and Masao Hanzawa in a Japanese journal, in 1936." See: Alice Mary Hilton, Logic, Computing Machines, and Automation (Washington, DC: Spartan Books, 1963), 230. Shannon was not the first to publish; he was only the first to get discovered.
Finally, here's one last quote, to back up my basic point: "Boolean algebra was something less than a major influence in the invention and design of early electronic computers...the 1950s were well under way before the algebra was used at all at some of the major pioneering electronic computer organizations." R. K. Richards, Digital Design (New York: John Wiley & Sons, 1971)
I hate to engage in ad hominem arguments, but since you started it, I'm not afraid to fire back. I'm beginning to think the reason that you don't want to identify yourself is that you don't want anything traceable to you, so people who know you can never find out you're an immature troll. You're too embarrassed to admit that unlike me, you have not spent hours in one of the largest libraries in the world, plowing through every issue of the IEEE Annals of the History of Computing, and selected issues from about a dozen other journals (including Transactions of the AIEE), and over a hundred different books, to winnow out the complicated truth about Shannon's contributions. Professional historians of science and technology (and their devoted students) know that the true story of innovation is never a simple straight line. Finally, if you don't know what STS, SSK, or SCOT stand for, then you have a lot of reading to do.

--Coolcaesar 07:53, 19 Jan 2005 (UTC)

I am really fond of all that you have written above. I need to make just one correction. Claude Shannon NEVER won a Nobel Prize, nor did any of his associates, nor has anyone in the communication theory and information theory part of electrical engineering. Some have won the Nobel Prize in Physics for such things as transistors, integrated circuits, and radio astronomy - especially for new kinds of radio receivers and for the resulting discoveries of pulsars and the cosmic radio background.
98.67.107.241 (talk) 13:24, 2 September 2012 (UTC)

Title

Why was this renamed from Claude Shannon? -MagnaMopus 17:43, 18 January 2006 (UTC)

Legacy and Tributes

In my view, remarks such as the 'greatest scientist of the 20th century' detract from the quality of this article. The tributes section needs to be edited. Cryptonaut 04:51, 23 April 2006 (UTC)

Article title

Is there some reason why this article is currently titled "Claude Elwood Shannon" instead of "Claude Shannon"? Wikipedia title guidelines for people use the most commonly known name that doesn't require disambiguation, preferring (for Western-style names) only "Given-name Surname" without a compelling reason otherwise. From my comp-sci and engineering studies, I seem to recall "Claude Shannon" as being the usual form, and there are no other Claude Shannons currently in Wikipedia, let alone one with such global prominence. ~ Jeff Q (talk) 17:32, 31 January 2007 (UTC)

Oops, I missed MagnaMopus's same, unanswered question above. If no one responds shortly, I will assume there is no good reason for this, and will ask the sysops to move this article (as Claude Shannon has a non-trivial edit history already).
Jeff Q (talk) 17:34, 31 January 2007 (UTC)
I concur. I came here to make the very comments made above.
LambiamTalk 05:16, 16 April 2007 (UTC)
Sounds good to me. Dicklyon 05:25, 16 April 2007 (UTC)
I've put in a move request.[1]  --LambiamTalk 05:29, 16 April 2007 (UTC)
So did I. But I put it in the wrong section, so I reverted it. Thanks. Dicklyon 05:32, 16 April 2007 (UTC)
Ladies and Gentlemen writing biographies, please do not forget to mention in which particular discipline the candidate obtained his or her Ph.D. In Shannon's case, I was under the impression that he did his Ph.D. in electrical engineering, but a friend of mine suggested that he actually did it in mathematics. I checked this article full of hope that it would clarify the matter, but to my disappointment, there is no mention of it. In fact, Shannon did his Ph.D. in mathematics. Thanks. —Preceding unsigned comment added by 76.67.191.137 (talk) 20:28, 21 July 2009 (UTC)
Comment from a friend on Shannon's photograph: his photo on the Wikipedia page exhibits vague diagonal noise. This is an instance of a scanner subsampling the photo, probably from a half-toned magazine illustration scanned at too low a resolution. This aliasing noise is described by the Nyquist-Shannon sampling theorem. Was this meant to be ironic? Timbabwe (talk) 21:52, 27 October 2009 (UTC)
No, of course not. Please do not look for specters behind every tree.
98.67.107.241 (talk) 15:34, 2 September 2012 (UTC)
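The diagnosis above is easy to reproduce synthetically. The following sketch (all numbers invented for illustration) builds a halftone-like screen, subsamples it with no anti-alias filtering, and finds the low frequency that the screen folds down to - the same mechanism that would produce diagonal noise in a rescanned magazine photo.

    import numpy as np

    n = 512
    yy, xx = np.mgrid[0:n, 0:n]
    f_screen = 0.23                  # halftone screen frequency in cycles/pixel (assumed)
    halftone = 0.5 + 0.5 * np.cos(2 * np.pi * f_screen * (xx + yy))   # 45-degree screen

    step = 4                         # naive subsampling with no low-pass filter
    scanned = halftone[::step, ::step]

    # The screen now sits at 0.23 * 4 = 0.92 cycles/sample, above the Nyquist
    # limit of 0.5, so it folds down to |0.92 - 1| = 0.08 cycles/sample:
    # slow diagonal bands where there used to be fine dots.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(scanned - scanned.mean())))
    peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    cycles_per_image = np.array(peak) - scanned.shape[0] // 2
    print("dominant aliased frequency (cycles/image):", cycles_per_image)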

Proper Noun?

OK, regarding the sudden (and, I believe, undiscussed) change to the capitalization of "information theory", I'm remembering things like:

"In 1948, Claude Shannon of the Bell Telephone Laboratories published two papers that established a new field of science, Information Theory..."

A bit-for-bit quote from:

Owen Thomas, "Linguistic Models and Information Theory," pages 431–438 in Languages in Contact and Contrast: Essays in Contact Linguistics, edited by Vladimir Ivir and Damir Kalogjera (originally published 1991).

Frankly, it's not that big a deal how we spell it, but it certainly is a proper noun by any reasonable definition, because it names a specific thing. That's why people use "Grand Unified Theory", or any of dozens of others. If the consensus here is to use lower case from the outset of the article, fine--as long as that's the consensus. It just seems to work better when it's discussed rather than slapped into place with a cryptic and less-than-explanatory edit summary.

Anyone have any thoughts?
UncleBubba T @ C ) 02:54, 17 April 2011 (UTC)

Names of specific things are not proper nouns. Where'd you get that idea? See books. Dicklyon (talk) 03:08, 17 April 2011 (UTC)
<attempt at humor> In what language? </attempt at humor>
My understanding is that it's pretty simple: A common noun describes/names a person, place or thing. A proper noun names a specific person, place or thing. From the top three hits on a Google search for "proper noun" (and yeah, one of 'em is WP, so they ain't proper sources, per se...) come the following:
  • Nouns name people, places, and things. Every noun can further be classified as common or proper. A proper noun has two distinctive features: 1) it will name a specific [usually a one-of-a-kind] item, and 2) it will begin with a capital letter no matter where it occurs in a sentence.[2]
  • A proper noun or proper name is a noun representing a unique entity (such as London, Jupiter, John Hunter, or Toyota), as distinguished from a common noun, which describes a class of entities (such as city, planet, person, or corporation).[1] Proper nouns are not normally preceded by an article or other limiting modifier (such as any or some), and are used to denote a particular person, place, or object without regard to any descriptive meaning the word or phrase may have[2] (for example, a town called "Newtown" may be, but does not necessarily have to be, a new [recently built] town).[3]
  • What is a proper noun? Definition: A proper noun is a noun that is the name of a specific individual, place, or object. Examples (English): Joseph, New York City, Empire State Building [4]
And yes, there is a helluva lot of variance out there, even in the Land of Books. Personally, though, I'd rather look at The Elements of Style (or the WP:MOS) than a list of publications, but that's just me... — UncleBubba T @ C ) 03:32, 17 April 2011 (UTC)
Well, I agree with the definition that "A proper noun is a noun that is the name of a specific individual, place, or object." However, I don't think there's a valid converse. The name of a specific individual, place, or object is not necessarily a proper noun. Either that, or one has to draw a very fine line in deciding what counts as a "specific object"; perhaps information theory is not one. I agree with you on respecting style guides, and the MOS in particular; I think it says that on Wikipedia we don't over-capitalize for emphasis the way many publications do (or I seem to recall something to that effect). Dicklyon (talk) 04:00, 17 April 2011 (UTC)
The section on capitalization does say "Philosophies, theories, movements, and doctrines do not begin with a capital letter unless the name derives from a proper noun..." Dicklyon (talk) 04:18, 17 April 2011 (UTC)
Argue about it as you wish if you are stubborn, but these are all proper nouns, especially because of their scientific and historical importance: Atomic Theory (the Theory of the Atom), Euclidean Geometry, Newtonian Mechanics, the Theory of Universal Gravitation, Special Relativity, General Relativity, Information Theory, Electromagnetic Theory, Quantum Mechanics, the Theory of Evolution, and Quantum Electrodynamics.
In mathematics, nearly all things that were named for people are capitalized, e.g. the Jacobian, the Hessian, and Taylor series. There is only one exception: "abelian" as in abelian group. Usually things that are not named for people are lower case, e.g. differential calculus and complex analysis. However, Information Theory is not strictly mathematical, just like General Relativity is not strictly mathematical.
98.67.107.241 (talk) 16:04, 2 September 2012 (UTC)

Dating

Re: A Symbolic Analysis of Relay and Switching Circuits. I note that "Mass. Inst. Tech DEC 20 1940 Library" is the stamp appearing on the copy of this paper in the MIT Library. The paper was "Submitted in Partial Fulfillment of the Requirements for the Degree of MASTER OF SCIENCE from the Massachusetts Institute of Technology 1940". The date typed beneath the "Signature of Author" reads "Department of Electrical Engineering, August 10, 1937". The dating in the discussion above and in the article itself has caused me some confusion.

You need to understand that back in the early 1940s and before, things did not get done instantly, even at M.I.T. Shannon's thesis for his M.S. in electrical engineering was doubtless completed and accepted by that department in August 1937. Then, nobody got around to putting it into the M.I.T. Library until December 1940.
Furthermore, back in those decades before 1950 (or later), putting master's theses into the school's library (and I mean anywhere) was not an automatic process. What we had here was a case of the department head, or the dean of engineering, or one of their assistants deciding, "This thesis is an important one, and a copy of it needs to be put into the library." Then, unless they had a spare carbon copy, they had to PAY someone to type up another copy of it. You need to understand that back in 1940, there was no such thing as a Xerox machine, or a dot-matrix printer, or a laser printer with which they could make more copies of something at will. Things concerning documents were more difficult and more costly back then. You need to learn about the technical history of the 20th Century.
98.67.107.241 (talk) 16:29, 2 September 2012 (UTC)

Purpose

"It is also possible to use the analogy between Booleian[sic] algebra and relay circuits in the opposite direction, i.e., to represent logical relations by means of electric circuits. Some interesting results have been obtained along this line, but are of no importance here." The purpose thus defined on page 16 (original numbering) continues and proofs are given throughout, until near the end there is a slight change in direction toward the practical. Section V (p51) Illustrative Examples "The examples are intended more to show the versatility of relay and switching circuits and to illustrate the use of the calculus in actual problems than to describe practical devices." However he does not shy away from the belief in its practical value when a moment later he continues "In fact, any operation that can be completely described to the required accuracy (if numerical) in a finite number of steps using the words "if," "or," "and," etc. (see Table II) can be done automatically with relays."

Shannon's concluding paragraph is prophetic: "As to the practicability of such a device, it might be said that J.P. Kulik spent 20 years in constructing a table of primes up to 100,000,000 and when finished it was found to contain so many errors that it was not worth publishing. The machine described here could probably be made to handle 5 numbers per second so that the table would require only about 2 months to construct." (A back-of-the-envelope check of this arithmetic follows below.)

The copy to which I refer is publicly available from MIT: http://hdl.handle.net/1721.1/11173 Pendare (talk) 17:25, 27 April 2011 (UTC)
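For what it's worth, Shannon's closing arithmetic can be checked. Testing all 10^8 integers at 5 per second would take about 7.7 months; the sketch below assumes the machine skips multiples of 2, 3, and 5 (my assumption - the quoted passage does not say), which lands almost exactly on "about 2 months".

    limit = 100_000_000                     # Kulik's table ran up to 10^8
    rate = 5                                # candidate numbers tested per second
    wheel = (1 / 2) * (2 / 3) * (4 / 5)     # fraction of integers coprime to 2, 3 and 5 (assumed wheel)
    candidates = limit * wheel              # about 2.67e7 numbers actually tested
    days = candidates / rate / 86_400
    print(f"{candidates:.3g} candidates -> {days:.0f} days (~{days / 30:.1f} months)")

This prints roughly 62 days, i.e. about 2.1 months, matching the thesis under that assumption.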

Plagiarism

A good chunk of the early sections seems to be plagiarised from AT&T Labs' (sometimes referred to as Shannon Labs) website, http://www2.research.att.com/~njas/doc/shannonbio.html

I don't currently have the time to fix it, but thought it might be helpful to point out as a TODO for anyone with some spare time. — Preceding unsigned comment added by 68.180.2.171 (talk) 04:58, 9 February 2012 (UTC)

Doctoral Students

The list of doctoral students included a link to Heinrich Ernst, which I believe is not the correct reference. I changed it to "Heinrich Arnold Ernst", according to the following Ph.D. thesis: http://dspace.mit.edu/handle/1721.1/15735 . The thesis contains a biographical note, which does not match Heinrich Ernst. Hence, they were two different men.
Duzian (talk) 12:12, 16 July 2012 (UTC)

It has been claimed that this was the most important master's thesis of all time.[3]

References to "Poundstone, William (2005). Fortune's Formula : The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street. Hill & Wang. ISBN 978-0-8090-4599-0." While I've not read Fortune's Formula, I highly doubt that it is a credible source on what 'the most important thesis of all time is'. ~~Jeanne — Preceding unsigned comment added by 148.88.244.175 (talk) 16:17, 25 November 2013 (UTC)

I deleted that line. Although it may be formally correct to state in an article that "it has been claimed that ..." when someone did indeed claim it, anyone can claim anything. So, it may be verifiable that person X made that claim. But then, because of its suggestiveness in the article: is that claim true? Is there a list somewhere of the grand and official World's Most Important Master's Theses Ever, and is CS number one on it? The formulation does not sound very neutral to me; it merely seems to serve a boosterist viewpoint. I think it becomes clear to the readership from the contents of the article already, without that particular line, that he did some very important work, and whether or not someone somewhere sometime claimed it was the most important master's thesis of all time (in the universe?) is hardly relevant in this regard, methinks. Poepkop (talk) 13:21, 8 February 2014 (UTC)