# Talk:Moore's law/Archive 4

## The iphone is not a computer

it's a Phone. If you wanted to show something that has a processor and memory, show a Watch. Those are computers nowadays in that sense, since they often have processors, memory, and programs. --62.1.89.9 (talk) 21:10, 11 January 2011 (UTC)

Disagree. All three are computers. And note that the numbers for 2010 (such as speed/price or capacity/price) will probably work out the same no matter which devices you compare. ---- CharlesGillingham (talk) 06:58, 12 January 2011 (UTC)
Absolutely and completely disagree. The iPhone contains a keyboard, processor, memory and display. You could argue that it does not have a printer, but I think I have seen printers with iPhone interfaces. Next question.
Right; like an iPod or an iPad or any other smartphone, it's a little computer with some special I/O devices. Dicklyon (talk) 22:49, 11 March 2011 (UTC)

How much of the value in a gizmo is in the software? If it's more than 50 percent then the gizmo is a computer. The Lockheed Martin F-35 Lightning II would be a good example of a computer. Hcobb (talk) 23:21, 11 March 2011 (UTC)

## Problem with refs 5 and 6

This latest edit changes wording in a way that I wanted to check with the cited sources, numbers 5 and 6; ref 5 says nothing about Moore's law; ref 6 links to an article whose title and author don't match what the citation says. Does anyone have some memory or history of these, or motivation to help clean them up? Dicklyon (talk) 22:47, 11 March 2011 (UTC)

It was broken when first added by User:CharlesGillingham in this diff. There are two useful sources here: the one cited, and the one linked. But mixing them up like this is something that will take some work to sort out. Dicklyon (talk) 01:53, 12 March 2011 (UTC)

The first source, from The Atlantic, discusses how digital electronics have changed the world economy over the last twenty years, even in "Old Economy" sectors such as oil. Its author, Rauch, writes:

In the 1980s Old Economy businesses tended to waste much of what they spent on computers and software. Companies in traditional industries would drop a PC on every desk and declare themselves computerized; they would buy spreadsheet programs and word-processing software and networking equipment that as often as not just substituted new frustrations for old ones. This began to change, however, as software and hardware grew in power, and as companies began learning how to use them not just as conveniences or crutches but to change the nature of the job. At first the impact, like a misty drizzle, was too small to show up in the national economic statistics. However, each innovation enabled other innovations, none of them revolutionary but all of them combining in an accelerating cascade. By the second half of the 1990s the aggregate effect on productivity became large enough to register in the national accounts, and the line between the New Economy and the Old Economy began to blur.

According to Rauch, the key factor driving global productivity numbers is Moore's law; although he does not name it specifically, he describes it, writing:

In the three decades since 1970 the power of microprocessors increased by a factor of 7,000. Computing chores that took a week in the early 1970s now take a minute. According to the Federal Reserve Bank of Dallas, the cost of storing one megabit of information, or enough for a 320-page book, fell from more than $5,000 in 1975 to seventeen cents in 1999.

So this source verifies the statement about "the utility of digital electronics in every segment of the world economy". The other (two?) sources were not added by me, although I might have moved them around accidentally. Sorry I didn't check the links more carefully. ---- CharlesGillingham (talk) 18:08, 14 March 2011 (UTC)

At any rate, I've removed the other sources, since the linked source doesn't seem to be directly on topic. The other source may have been, I don't know. The Rauch article verifies the statement "[The improvement in digital electronics] has dramatically enhanced the utility of digital electronics in nearly every segment of the world economy. Moore's law describes a driving force of technological and social change in the late 20th and early 21st centuries." ---- CharlesGillingham (talk) 18:25, 14 March 2011 (UTC)

I put the Rauch ref back to verify a statement about electronics improvements, not about Moore's law, and restored the other as two, appropriately sorted out. Dicklyon (talk) 03:24, 15 March 2011 (UTC)

## Updated Image

I've made a new version of the transistor count image used in this article. Please visit my talk page to make any suggestions before it is added to Moore's law and Transistor count. -- Wgsimon (talk) 15:15, 13 May 2011 (UTC)

## Removing photo with iPhone

A photograph showing an iPhone and an Osborne isn't appropriate regardless of its caption, and a comparison of clock speeds is not related to Moore's Law. For each of these reasons I intend to remove the photo if it is not removed by an Admin. The photo appears to be nothing other than advertising.
Here is why: Moore's law refers to a doubling of discrete active components in a given area of substrate, not to a change in performance -- as a difference in clock speeds may imply. (Personally, I hope that performance improves too, but it isn't a guarantee.) Clock speed isn't relevant to performance anyway, unless the architecture is the same. Device architectures and the architecture internal to the CPU used by each aren't even as close as "apples and oranges". 71.211.233.155 (talk) 22:14, 14 May 2011 (UTC)

Well, it's not an issue for an admin; anyone can remove it. But I'd question your rationale. I'd say instead that it would be good to compare the processors in those devices, in terms of year made and number of transistors; and it would be good to connect device size (Moore's law) to performance (a happy beneficiary of Moore's law). When Hoeneisen and Mead first analyzed MOS scaling, about the same time that Mead made up the term Moore's law for Moore's observation, it was apparent that performance would be getting better on an exponential curve, pretty much along with density. These things are tightly linked, not a coincidental relationship. Dicklyon (talk) 22:32, 14 May 2011 (UTC)

## Very Nice

I have been reading a lot of very poorly written and overly complex and confusing articles on Wiki lately. I was very happy to read this article and wanted to thank those who have worked together on it. — Preceding unsigned comment added by Mantion (talk · contribs) 19:32, 17 September 2011 (UTC)

## 60% correct, but possible copyvio?

While looking for a reference that verifies this number, I found the actual magazine cited, and the number is in fact 60%. However, what is written appears to be a copyvio of that source. Below are the two texts; what Wikipedia has different is in bold, which isn't much.

Wikipedia article: Since then, technological change has clearly slowed down.
In recent times, every new year allowed mankind to carry out roughly 60% of the computations that could have possibly been executed by all existing general-purpose computers before that year.

Science Magazine: Since then, the technological progress has slowed. In recent times, every new year allows humankind to carry out roughly 60% of the computations that could have possibly been executed by all existing general-purpose computers before that year.

I'm not sure whether to remove it outright or just rewrite it, but hopefully someone more knowledgeable in this subject can make that call? - SudoGhost 05:10, 18 October 2011 (UTC)

Just in case someone looks for when the information was inserted, here is the diff. It was inserted on 17 September 2011, and the magazine is dated 1 April 2011. - SudoGhost 05:23, 18 October 2011 (UTC)

## Osborne Executive vs iPhone

In the caption for the image of the Osborne Executive (near the top of the article), it is claimed that the iPhone has 100 times the processing power of the Osborne Executive. Does anyone have a source for this claim? Our articles on each show that the iPhone's clock rate is about 100 times higher than the Osborne Executive's but, as we all know, there is far more to processing power than clock rate. --Tango (talk) 01:26, 3 August 2010 (UTC)

"An Osborne Executive portable computer, from 1982, and an iPhone, released 2007 (iPhone 3G in picture). The Executive weighs 100 times as much, has nearly 500 times the volume, cost 10 times as much, and has a 100th the clock frequency of the iPhone."

The Osborne Executive weighed 47 lb; the iPhone is 4.3 oz, or about 0.3 lb — roughly 1/156th of its weight. Its Z80 microprocessor was fixed at 4 MHz; the iPhone's is variable, up to 1 GHz, or about 250 times faster. Would someone like to revise this caption? —Preceding unsigned comment added by 69.232.205.64 (talk) 10:07, 1 March 2011 (UTC)

Let me see if I can track down a MIPS rating for the Zilog Z80.
The MIPS rating, or VAX-MIPS, was "Z80 2 Mhz 1976 91 0.057", and the closest I can come for the A9 is "4000 Dhrystone MIPS", which makes it about 10,000 times more powerful (and, at 250 times faster, about 40 times more efficient). [1] Of course, YMMV: your mileage may vary. —Preceding unsigned comment added by 69.232.205.64 (talk) 23:55, 1 March 2011 (UTC)

My issue with this caption is the "cost ten times as much." I'm wondering if whoever wrote the caption used the "subsidized" price of the iPhone (with the carrier eating a large portion of the price to induce you to sign a contract) instead of the "real" price of about $600. Four times (around $2400 for an Executive) I'd believe; ten ($6000) is ludicrous. I suspected as much when I started writing this, and have since confirmed on the Osborne Executive page that the initial price was $2495, so I'm going ahead and making the change. 173.196.56.2 (talk) 18:36, 26 October 2011 (UTC)

It seems to me as though the 10x figure was attempting to adjust for inflation, which would make it roughly correct. [2] An adjusted price would be more useful than comparing prices from 1983 and 2007. Currently it's incorrect, as it states the price comparison as a current cost for a 1983 machine. Muleattack (talk) 20:23, 26 October 2011 (UTC)

I concur. "What cost $2495 in 1983 would cost $5389.07 in 2010." was the result I got from that reference. Darrell_Greenwood (talk) 20:34, 26 October 2011 (UTC)

## Hey

I made a lot of changes to this article. I didn't delete anything, I just simply added invaluable information to it. Ok, I added a lot of information to Moores Law. I gave everything good source citations so you can see that I'm not lying. So everything is sourced and citated. So yeah I added a shit ton of info. Long live Moores Law!!!! Long live Intel!!! Intel is number one!!!! Moores law to the future!!!! — Preceding unsigned comment added by TechnologyIsPower (talk · contribs)

A shit-ton indeed.
Dicklyon (talk) 03:11, 5 November 2011 (UTC)

## Future trends

I removed a fair bit of this section, but it was reverted. To explain the problems with it:

• It is mostly speculation.
• A large part of it is unsourced.
• Of the parts of it that are sourced, many of the cites say nothing whatsoever about Moore's Law. It is merely the opinion of the contributing editor that what they say has implications for Moore's Law.

--Escape Orbit (Talk) 02:34, 5 November 2011 (UTC)

It shouldn't be included; it has no specific relation to Moore's law. Muleattack (talk) 02:59, 5 November 2011 (UTC)

I tend to agree, though I haven't looked through it carefully yet. The guy who reverted went too far back, and added some edits of his own at the same time, greatly complicating matters. I might want to hear from him before repeating the removals. Dicklyon (talk) 03:03, 5 November 2011 (UTC)

Looking it over, I find too many sentences copied straight from sources, cited and otherwise; clear copyvio problems. And very little connection to Moore's law. So I'll go ahead and take it back to a state without that recent "shit ton" of new material. Dicklyon (talk) 03:54, 5 November 2011 (UTC)

I support removal of the recent TechnologyIsPower (talk · contribs) material in this section. This section is already overly long and speculative. It needs to be trimmed, not expanded. --Kvng (talk) 15:34, 7 November 2011 (UTC)

## Hey its me TechnologyIsPower

listen I know what the hell I'm talking about. At least I provide all the sources for my facts!!!! Hey I added a bunch of crucial but true information to this article!!!! I mean we're talking about moores law for crying out loud and everything I've added is true I proved the source for crying out loud. For fuck's sake it comes from Paul Otellini himself the CEO and president of Intel!!!! PAUL OTELLINI HIMSELF DAMMIT!!!!! Does that make any sense to you!!!! PAUL OTELLINI THE CEO AND PRESIDENT OF INTEL!!!
Intel the number one semiconductor company in the world!!! FUCK!!!! YOU HAVE NO IDEA HOW MUCH YOU'RE PISSING ME OFF!!!! This is an encyclopedia ok! I'm trying to provide the truth here!!!! THE TRUTH!!!!

The nature of the content aside, the actual text being inserted into the article is a copyright violation, because it's a verbatim copy from the various sources. This is why I'm reverting the insertion of the information: copyright violations are a serious issue, and when presented with that, it is better to remove the potential copyvio and discuss the material before reinserting it, not simply commenting and then reverting. Please do not restore the information as it is currently written, and please note that merely changing a few words around is still a copyright violation. However, the substance of the content is being disputed as well, so I think you should give other editors a chance to comment before reinserting the information into the article in any form. Thank you. - SudoGhost 14:21, 5 November 2011 (UTC)

## Misnomer

Why is it called a law? It's not a law. It's just an observation. Has nobody of prominence ever raised this basic point? If someone has, the article should note it. 108.36.209.26 (talk) 20:31, 29 February 2012 (UTC)

We use the word "law" to describe any observation that has a precise form. Consider the law of diminishing returns, Gresham's law, Murphy's law, Engel's law, the law of supply and demand, Zipf's law, etc. What does the word "law" suggest to you, and how is it a misnomer in this case? ---- CharlesGillingham (talk) 18:37, 2 March 2012 (UTC)

I don't think it's correct to say that we apply the word "law" to "any observation that has a precise form." If Gordon Moore had said, "Semiconductor performance has doubled every two years over the past ten years," he would have been making an observation in a precise form. Would he have expressed a "law"?
I wasn't familiar with all of the examples you've given, but I've followed your links to each of them. I won't discuss Zipf's Law, because I'm not sure that that is rightly termed a law, as I would use the term. Rather than try to define "law," I'll simply say that each of your other examples strikes me as something that can be restated as a conditional:

Murphy's Law: "If there is something that can go wrong, it will go wrong."

Gresham's Law: "If government gives official preference to one of two money types that are circulating among the same persons, the use of that type will diminish" (or whatever the idea is).

Law of Diminishing Returns: "If a single factor of production is increased, while all other factors are constant, there will be a marginal decrease in the output of the production process." (I don't even have to know what that means — or even to know whether it's true — to recognize it as a "law.")

Engel's Law: "If income rises, the proportion of it spent on food falls, even if the amount spent on food increases."

Moore's Law can't be restated that way. What is the "law"? "If Earth continues to exist, semiconductors will double in performance every two years"? There might not even be a semiconductor industry tomorrow. How about, "If there is a semiconductor industry, its products will double in performance every two years"? That's not guaranteed. Moore had simply observed the advances that had taken place in semiconductor development up to the time at which he made the remark; on the basis, presumably, of his understanding of the processes involved and various techniques that were being explored or imagined, he said he thought the advances would continue for some time at the rate at which they'd been occurring. He wasn't saying that, if semiconductor development were to continue, that would necessarily happen. In a sense, I suppose, nothing at all will "necessarily happen."
Tomorrow, some jet airliner might explode in flight because a gas will not have behaved as Boyle's Law had led the airliner's designers to expect it to behave. That would probably force abandonment of Boyle's Law. Next year, the use of some set of currencies might not conform to Gresham's Law. Even so, the expectation that is created by a "law" must, to my mind, involve a condition other than the mere existence of something: "If a semiconductor industry exists, its products will double in performance every two years." Says who? "Gordon E. Moore said it." No, he didn't; that would have been foolish. Boyle's Law, in contrast, says that if the pressure on a gas is multiplied by x, the gas's volume will decrease to 1/x of its original value (or whatever the idea is). 108.36.209.26 (talk) 06:49, 4 March 2012 (UTC)

PS Suppose that tomorrow and every day thereafter, for a total of two years, every person in the semiconductor industry were to have not a single idea. Would semiconductor performance over that period increase at what might reasonably have been termed "Moore's Rate"? If Moore's Law is a law, as I understand that term, it would; somehow, the universe would ensure it: "How do you like that? Every designer, engineer, and technician in Silicon Valley has been afflicted with Alzheimer's disease since this day two years ago, yet semiconductor performance has doubled since then." "How do you explain that?" "I don't know; it's Moore's Law." 108.36.209.26 (talk) 08:24, 4 March 2012 (UTC)

You can write Moore's law as a conditional, of course: "if the cost per byte of memory is x today, it will be x/2 in two years." I think you are thinking only of "natural laws" (like Boyle's law). That's why all my examples were from economics or statistics. There are other kinds of laws besides physical laws. I think the difference is in how we answer the question why?, i.e., what we see as the cause of the law.
Physical laws are caused by "the universe" (or, as Stephen Hawking would have it, "the mind of God"). But economic laws have much more mundane causes -- they're caused by choices made by consumers and firms and so on, statistically averaged over large numbers. ---- CharlesGillingham (talk) 12:36, 4 March 2012 (UTC)

I'm not thinking only of physical laws. I recognized your non-physical examples as laws. To recast Moore's Law as a conditional is to make a statement that is unfounded; that was my point. Let me give you two other examples — completely imaginary. Let's say a (fictional) tobacco-company executive named John Smith says, in 1930: "Sales of cigarettes have doubled every two years for the past ten years and will probably continue to grow at that rate for the next ten years." Would it be reasonable to state the following, as "Smith's Law": "Cigarette sales double every two years"? Let's add that Mr. Smith quits his tobacco job and takes a job with the Census Bureau. One day, he says, "U.S. population has doubled every quarter-century for the past century and will probably continue to grow at that rate for the following century." Would it be reasonable to state the following, as "Smith's Law": "U.S. population doubles every quarter-century"? Neither one of those statements would be justified by what Mr. Smith had said; and it would be very odd, to my mind, for someone to utter either of them, particularly as "a law." As I suggested above, someone might reasonably shorthand either of Mr. Smith's observations as, say, "Smith's Rate," meaning the rate that Smith had identified at such and such a time and that he had thought would continue for a while. Accordingly, a person might say, for example, "U.S. population continues to grow at Smith's Rate." It would not be legitimate to recast the rate as a conditional — in the way you restated Moore's Law — by saying, "If U.S. population is x today, it will be 2x a quarter-century from now."
That is not known — and it is certainly not what Mr. Smith said. 108.36.209.26 (talk) 17:19, 4 March 2012 (UTC)

If you want to add something about why it's called a law, do the research, bring a source you can cite, and edit the article. Further discussion here is pointless without a source. Dicklyon (talk) 17:57, 4 March 2012 (UTC)

For the record: I was suggesting that somebody might add a source that remarks that it should not be called a law. This is a simple point — a linguistic one — and I thought someone with a particular interest in editing the article might be interested in researching it. Mr. Gillingham — I'll make a further remark: The Wikipedia article's present footnote 67 cites (and links to) a 2003 interview in which Dr. Michio Kaku says the following: "[W]e physicists are desperately trying to patch up Moore’s Law, and at the present time we have to admit that we have no successor to silicon, which means that Moore’s Law will collapse in 15 to 20 years." Moore's "law" might collapse at some point? How can a "law" collapse? Can you imagine an equivalent statement about Gresham's Law or the Law of Diminishing Returns or the other examples that you've given (and that I've addressed)? I cannot — and I don't think that's because I have a feeble imagination. I'm pretty sure it's because I know the proper use of the word "law." (I hope I needn't explain that my concern here is not with Dr. Kaku's prediction itself. I don't care whether he's right or wrong. My point is that his statement makes clear that Moore's "Law" is not a "law.")

PS In examining the archives of the present talk page, I see that this subject has been brought up by someone else (in January 2009, under the heading "Is Moore's Law Really a Scientific Law?"). 108.36.209.26 (talk) 20:16, 4 March 2012 (UTC)

Yes, that's a good source for the fact that this "law" is not expected to work forever. Its end has been often predicted.
More generally, exponential growth can't last forever; Verhulst substituted a logistic function, which Lotka later termed a "law of population growth"; is that a law? Dicklyon (talk) 04:51, 5 March 2012 (UTC)

I'm afraid the item you've linked is well beyond my comprehension; I won't comment on it. Whether exponential growth can last forever is irrelevant. This isn't a technical or mathematical issue. As I've said, it's simply a linguistic issue. Early in the history of semiconductor development, Moore took a moment to observe the development's rate. I don't know exactly what he said, but apparently it was something like, "The number of transistors that can be fit into a given space has doubled every year for the past ten years." He wasn't saying that that was some sort of natural phenomenon, like, say, the rate at which a colony of bacteria in a Petri dish might grow. He simply observed that that was the rate of success to that point; he added that he thought that the development would continue at that rate for at least some time (another ten years — or whatever he said). He wasn't positing a "law" of semiconductor development. Suppose someone had announced, the day after his remark, that semiconductors cause cancer — and resultantly, the nascent semiconductor industry had collapsed. Would the great minds of the world have had to gather to determine what happened to "Moore's Law" — as if the sun had burned out fifteen billion years earlier than expected? Somebody (maybe the person named in the Wikipedia article) rather curiously — not to say inanely — referred to Moore's observation as a "law," and the term has been parroted ever since. That's all that's happened here; I'm surprised nobody brought it to an end a long time ago. 108.36.209.26 (talk) 09:09, 5 March 2012 (UTC)

PS I'll venture one comment about the "law of population growth" that you linked — even though, as I've indicated, the material — the math — is well beyond my comprehension.
You've asked me, in effect, whether I think that Mr. Lotka was right in referring to a "law" of population growth. My guess would be yes, by which I mean that, to the extent the reasoning underlying his curve (if that's what it is) is valid, the curve represents a "law" of population growth. Presumably, Mr. Lotka would never have to make a remark like the one Dr. Kaku made in the article discussed above. That is, he wouldn't have to explain that there has arisen some situation in which he's trying to "patch up" his "law" of population growth, because it's facing collapse. He either thinks it's a "law" — or doesn't. That doesn't mean he would never modify it or reject it. At some point, he might become aware of something that leads him to believe his original reasoning was faulty. He might modify or even reject the law; but to do so would be to say that the law had not been stated correctly, i.e., that it had never applied, that it had represented a misunderstanding. The rate Gordon Moore observed was not, as far as I know, erroneous; it didn't represent a misunderstanding. Apparently, it was the actual rate of growth during the period he studied — and for some time thereafter, as he foresaw; it might not be the rate of growth tomorrow. That's all. 108.36.209.26 (talk) 09:53, 5 March 2012 (UTC)

And would I be presuming too much if I were to say we might want to listen to the man himself — Dr. Gordon E. Moore? Here, from the April 19, 1965, issue of Electronics Magazine, is the very passage in which he offers his observation — the very passage that is quoted in the Wikipedia article:

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.
That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000.

Do you see the word "law" there? I don't. Do you see the word "uncertain" there? I do. The entire article, in which Dr. Moore explains some basics and discusses some possibilities, is worth reading. 108.36.209.26 (talk) 00:54, 6 March 2012 (UTC)

No, Moore didn't call it a law; it was dubbed "Moore's law" some years later, by Carver Mead. It's fine to quibble about what it's called, and to speculate on when it will break down, as many have done, but "Moore's law" is what it's called. I don't see a great problem with that. Dicklyon (talk) 01:14, 6 March 2012 (UTC)

I'm not quibbling — but I've said all I'm in a position to say on the subject. 108.36.209.26 (talk) 02:42, 6 March 2012 (UTC)

No — I will say a further thing: a forecast of a rate of progress is not the enunciation of a law. That shouldn't even have to be said — but apparently it does. 108.36.209.26 (talk) 04:57, 6 March 2012 (UTC)

At least it's not mathematically absurd, like the Weber–Fechner law. Dicklyon (talk) 05:59, 6 March 2012 (UTC)

I'm afraid I don't have the mathematical ability to appreciate the absurdity you see there, but thanks for the link. I'd not known someone had tried to quantify what Weber and Fechner tried to quantify. That, I think, is what is most striking about Wikipedia: how easy it makes it to learn of a subject. 108.36.209.26 (talk) 06:49, 6 March 2012 (UTC)

## Moore’s Law is about speed, not about the number of transistors

"Moore’s Law is a computing term which originated around 1970; the simplified version of this law states that processor speeds, or overall processing power for computers will double every two years... From 2000 – 2009 there has not really been much of a speed difference as the speeds range from 1.3 GHz to 2.8 GHz, which suggests that the speeds have barely doubled within a 10 year span."
source: http://www.mooreslaw.org/ Quinacrine (talk) 19:32, 2 June 2012 (UTC)

The site referenced contains only the one page, unsigned. In other words, it is not a very good source, just an anonymous person's unsigned opinion. Darrell_Greenwood (talk) 21:32, 2 June 2012 (UTC)

## Two years? 18 months? One year?

I don't understand why there is so much quibbling over the rate in the first paragraph. This is a historical trend; there can be only one right answer (within a margin of error, of course). Can we get a source that gives the correct number? I also think that you should avoid bringing up small complications in the first paragraph of an article. This is poor writing. The reader needs to know what the basic idea is and why it is interesting and important. ---- CharlesGillingham (talk) 02:39, 6 December 2011 (UTC)

The cited sources support exactly what it says, and they're linked and free for anyone to check. If you can improve the presentation, please do. Dicklyon (talk) 02:50, 6 December 2011 (UTC)

I have removed the 1 year figure from the lead. It's accurate but historical and distracting. All the gory detail is covered and referenced in the History section. --Kvng (talk) 16:24, 7 December 2011 (UTC)

The 21 February 2012 MtneerInTN edit should be undone, because it replaces a valid reference with a wikipedia chart reference, which is not allowed, because wikipedia sources cannot be used as references. — Preceding unsigned comment added by 186.153.227.144 (talk) 17:30, 22 February 2012 (UTC)

## GA Review

This review is transcluded from Talk:Moore's law/GA2. The edit link for this section can be used to add comments to the review. Reviewer: Astrocog (talk · contribs) 21:54, 7 December 2011 (UTC)

Initial thoughts:

• There are two links that need to be routed to specific pages. See dablinks for details.
• Make sure all images have alt-text. Currently none of them do.
• Lots of issues with external links. See the report for details.
I recommend finding archived versions of the websites using the Wayback Machine when possible. This eliminates issues with retrieval.

GA review (see here for what the criteria are, and here for what they are not)

1. It is reasonably well written.

a (prose): b (MoS for lead, layout, word choice, fiction, and lists):

• I would expect that the lead would credit Gordon Moore in the first or second sentence, rather than the third paragraph! In fact, that third paragraph should probably come first. ~KvnG 16:42, 8 October 2013 (UTC)
• Not sure why there have to be all those separate subsections in the "Consequences" section. Just make them regular paragraphs. ~KvnG 17:37, 5 December 2013 (UTC)
• Do not put notes or wikilinks inside of direct quotes. This is all over the article. ~KvnG 14:03, 28 March 2014 (UTC)
• The "See also" section is too big. Several of the items on the list are already linked in the article, making them unnecessary in this list. Other items are unnecessary, such as "quantum computing" and "Second half of the chessboard". Wha? Unless it's directly related to the subject of the article, and would be of direct interest to a general reader of this article, don't include it in this section.
• OK, so you wikilink "K" for Kelvin, but not "C" for Celsius? C'mon, editors. ~KvnG 15:43, 28 March 2014 (UTC)

2. It is factually accurate and verifiable.

a (references): b (citations to reliable sources): c (OR):

• Plenty of references throughout, and the citations are mostly written properly. Some do not have enough information, such as author, date, or publisher (for example, #1...). The section on "Major enabling factors" is pretty much synthesis and OR. A cited source in this section is from a related paper from 1963. No other reference is given that this source, older than Moore's own paper, was "enabling". Complete OR. Other innovations cited here, such as the IBM/Georgia Tech speed record, have a reference that doesn't even mention Moore's Law. More synth.
This is not becoming of a Good Article. This article's editors need to take a close look at the text and the attached sources to fix problems like this.

3. It is broad in its coverage.
a (major aspects): b (focused):
• Article hits the major points.
• However, this article is replete with digressions and trivia. Does a general reader really need to know that Intel gave $10,000 to a man from the UK for his magazine? Maybe if there was some surrounding context for the bit, but right now that sticks out as trivia. Almost as soon as the article begins, it goes off-topic into "alternate formulations", even while admitting that "Moore himself wrote only about the density of components (or transistors) at minimum cost." So, stay on-topic. Get the material about Moore's Law at the beginning and put the offshoots and alternative formulations at, or near, the end of the article.
• The whole section on futurists is too much. It could be condensed into one or two paragraphs, minus so many indulgent quotes, and not lose any of the content.
4. It follows the neutral point of view policy.
Fair representation without bias:
Doesn't seem particularly biased.
5. It is stable.
No edit wars, etc.:
In the last month, there was an edit war. Looking at a longer time span in the past few months, I see there is significant back and forth in terms of the article's size. This article is currently not stable by my reckoning.
6. It is illustrated by images, where possible and appropriate.
a (images are tagged and non-free images have fair use rationales): b (appropriate use with suitable captions):
• All images need alt text.
• Lose the animated image. It doesn't illustrate anything meaningful for a general reader, and its caption is complete traxoline.
• The main image is small and difficult to read. Are you sure you want this to represent the concept? Can a more concise image easily readable at this resolution be made?
7. Overall: This article has some serious problems that prevent me from passing the GAN, or even putting it on hold. Three major issues: prose/MOS, original research, and focus. This article seems to be half about topics OTHER than Moore's Law. There are numerous instances of synthesis from sources that don't directly mention Moore's Law. The prose, and even image captions at one point, are too technical for general readers. I'm not saying it needs to be dumbed down. It needs to be fundamentally rewritten to focus on the subject itself and explain the broad points to a general reader. Avoid digressions and speculations.
Pass/Fail:


Prefetching disambiguation - It seems clear to me that this refers to instruction prefetch (since the issue is with unused processors, prefetching the next instruction would save time if the instruction is used).

Also: The internet, particularly cloud computing, is another circumvention of the physical limits of a computer, since both memory and processing power available to the computer user have increased by using the net, while staying within moderate size, weight, and cost. A paragraph on this would not be out of place.

since there has been an edit war i'm not adding anything - if my ideas are of value then someone else will take them up. Sosci (talk) 19:11, 10 February 2012 (UTC)

Here is a quote from an Intel manual (Intel® 64 and IA-32 Architectures Software Developer’s Manual, download PDF from the Intel website): " In the mid-1960s, Intel cofounder and Chairman Emeritus Gordon Moore had this observation: “... the number of transistors that would be incorporated on a silicon die would double every 18 months for the next several years.” Over the past three and half decades, this prediction known as “Moore's Law” has continued to hold true." (vol. 1, page 2-29) I have not edited the page to reflect this, but I did undo someone else's (146.7.40.40) change from two years to five years and 18 months to 60 months. As I'm new to this, I forgot to mark that change as vandalism. — Preceding unsigned comment added by 74.113.189.228 (talk) 14:58, 5 September 2012 (UTC)

## Nonsense

It is more important for titles to be recognizable than pedantic. See WP:TITLE. All pompous statements should be cited and possibly quoted. Which statements are you referring to exactly? --Kvng (talk) 13:12, 17 May 2012 (UTC)
The exponential increase in the power of digital electronics has affected nearly every segment of the world economy. This fact is stated in the article with a solid citation, and is also commonly known. ---- CharlesGillingham (talk) 20:31, 19 June 2012 (UTC)
Really, "nearly every segment of the world economy"? Construction? Agriculture? Transportation? Sure, they benefit from improved information systems, but only in a smallish way, not really exponentially, compared to the information parts of the economy. Dicklyon (talk) 21:39, 19 June 2012 (UTC)
The article only says every segment was affected. It doesn't say the effect is large. Transportation and agriculture were revolutionized by the introduction of computerized inventory control in the late 80s and 90s. ---- CharlesGillingham (talk) 09:50, 30 June 2012 (UTC)
I feel like my answer above was a little glib, and you (Dicklyon) have questioned this point in the past, so I thought I would add a bit more about it.
Computerized inventory control was enabled by the affordability of bar-code readers, point-of-sale computers (i.e. "smart" cash registers), and "intra"-net servers and modems. None of these devices were cost-effective with the digital technology of the 1970s. In the 80s, in accordance with Moore's law, these devices became affordable and their adoption revolutionized the "supply chain" industries (agriculture/manufacturing, transportation, retail); productivity numbers went up as computerized retail companies (i.e. Walmart and the "box" stores) drove their less efficient competitors out of business.
The Rauch article (cited in the article) discusses in detail how the oil industry (usually considered part of the "old economy") improved productivity exponentially throughout the 90s when computer technology was applied to oil discovery and extraction.
Rauch goes on to document how computer technology is behind much of the growth in productivity numbers throughout the world economy (even the "old economy") as one industry after another discovers that computers are finally able to solve a particular problem more affordably. As the price of computers drops further, productivity numbers (for a time) go up exponentially. By the time one industry has finished "computerizing" some process, some other industry is adopting newer, faster computers to another process. Thus, Rauch argues that global economic productivity numbers are tied to the falling price of digital electronics, and thus to Moore's law.
I think this point is important for the article, because it explains the significance of Moore's law to the world as a whole, its significance in the history of technology, economics, and, indeed, its significance to world history in general. (I would like to research and write a section for the article that details this, but I haven't had the time.)
Note that this has nothing to do with how futurists have adopted the term "Moore's law" to mean "progress" or something equally abstract. This has to do with the specific economic impact of the exponentially falling price of an extraordinarily versatile product over the last fifty or sixty years. ---- CharlesGillingham (talk) 23:05, 4 July 2012 (UTC)

## I can't find a separate section relating Moore's law to nanotechnology in this article.

I can't find a separate section relating Moore's law to nanotechnology in this article. Please provide a separate section featuring Moore's law as it relates to nanotechnology. Ram nareshji (talk) 04:42, 19 October 2012 (UTC)

## Main Image

What is this, a picture for ants? This article could really use a higher-res one. 216.246.130.20 (talk) 03:48, 22 May 2013 (UTC)

Yes, we shrink the image by a factor of two (in area) every two years. Soon even ants won't be able to read it without clicking on it. Dicklyon (talk) 05:40, 22 May 2013 (UTC)

## Properly, Moore's law is a "rule of thumb"

Properly, Moore's law is a "rule of thumb" derived from the "observation" that computing power doubles every 18-24 months.

Per wikipedia: "A rule of thumb is a principle with broad application that is not intended to be strictly accurate or reliable for every situation. It is an easily learned and easily applied procedure for approximately calculating or recalling some value, or for making some determination."

If this does not describe Moore's law, then what does it describe?

Also, my English professor mom always used to claim that long complicated sentences should be cut in pieces for clarity and that one short word is always better than two. Clipjoint (talk) 18:20, 7 March 2012 (UTC)

Because Moore's law involves a forecast, it's not what I personally would normally think of as a rule of thumb — but yes, I suppose you could call it that.
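For what it's worth, the practical difference between the quibbled-over doubling periods is easy to quantify. A minimal sketch of the rule-of-thumb arithmetic (illustrative figures only, not taken from any cited source):

```python
# Growth factor implied by a fixed doubling period.
def growth_factor(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

# Over one decade, the choice of period matters enormously:
print(growth_factor(10, 1.0))  # doubling yearly  -> 1024x
print(growth_factor(10, 1.5))  # every 18 months  -> ~102x
print(growth_factor(10, 2.0))  # every two years  -> 32x
```

So the "quibbling" over two years vs. 18 months vs. one year is a factor-of-30 disagreement over a decade, which is why the lead wording matters.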

I only chose the best Pix/$ cameras as they came along, and I agree the line looks suspiciously straight, but I can only plot the data I collected! Happy to debate ... although recollecting the raw data might be a challenge! Barry Hendy Barry.hendy (talk) 10:08, 2 December 2013 (UTC)
It seems to me you should restore it (if you haven't already). ---- CharlesGillingham (talk) 04:47, 27 January 2014 (UTC)
I don't see anything I'd call a WP:RS on this. And if there is, it shouldn't be the author with the WP:COI who restores it. Dicklyon (talk) 08:07, 27 January 2014 (UTC)
I had a private discussion with Barry.hendy and added back a brief discussion with references. I did not restore the graph because of WP:RS and WP:OR concerns. ~KvnG 15:10, 30 January 2014 (UTC)

## Murphy's Law and Quality Control

Line 182 (I think): "Moore says he now sees his law as more beautiful than he had realized: 'Moore's law is a violation of Murphy's law. Everything gets better and better.' Some believe Murphy's law has been cleverly hidden from end users by increasingly skilled quality control." The first bit is attributed to Gordon Moore himself, but the latter is currently my own statement (I dropped a {{cn}} tag on it.) We have all kinds of articles about the technical specifics of Murphy's law (paraphrased: "Anything you allow to go wrong will go wrong") and not allowing too many things to go wrong, even as transistor counts soar into the trillions. This includes design-for-test circuitry, radiographic inspection, error correction coding, automated test equipment, skilled process engineering and FCC/IEEE/IPC/ISO/JEDEC/ETC/ETC/ETC quality standardization. Murphy's law can't be truly violated in the manner Moore describes, and the industry does know this. I just haven't seen a published generalization of that, except my own, obviously. I'm guessing there has to be one already out there for the finding.
Featherwinglove (talk) 01:46, 13 February 2014 (UTC)

## See also

I made some progress cleaning up the See also section as per GA review suggestion. I believe the following links don't belong in the See also section but the article should be improved to include them as links in the body. ~KvnG 14:36, 28 March 2014 (UTC)

## Should "processor speed" link to Clock rate?

The introductory paragraph states "The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed". I assume that what the original author meant was performance, not merely clock speed. If so, having "processing speed" link to Clock rate would be somewhat misleading. — Preceding unsigned comment added by Paul A. Clayton (talkcontribs) 17:51, 8 June 2014 (UTC)
Yes, the clock speed trend is flat for the past decade, and performance isn't easily measured. Transistor count is another obsolete performance metric. During the past decade, quality-adjusted microprocessor price indexes, which try to incorporate performance, indicate that Moore's law advancement has continued (ref. Byrne (2013)). I've replaced clock speed with microprocessor prices. There are likely many more Moore's law indicators. — Preceding unsigned comment added by 71.128.35.13 (talk) 21:25, 9 June 2014 (UTC)

## GA review improvements

A 2011 GA review by Astrocog (talk · contribs) contains some good suggestions for improvements. Not much of it has been addressed in the current version. I'm inclined to implement these suggestions when I have time unless anyone finds anything in these suggestions to be controversial. ~KvnG 17:45, 2 August 2013 (UTC)
I've been watching this article for about five years, and it seems to me that nobody really wants to take the time to take it to GA or FA, so if you want to take it on, that would be great. The article suffers from a large number of "drive-by" edits, some of which help, some of which don't.
There are several of us watching here to try to prevent nonsense and keep the balance. Only the worst edits actually get fixed. Thus the article is quite spotty overall without a real editorial hand to keep it organized, even out the prose, keep the balance, and track down the sources. ---- CharlesGillingham (talk) 04:44, 27 January 2014 (UTC)

This article was over-hyping the excimer laser before recent edits, as the excimer laser main article still does. Other innovations deserve mention, like the sensitive photoresist that enables use of the laser. 71.128.35.13 (talk) 03:55, 21 May 2014 (UTC)

V93a98 wrote the following edit summary on 2 June 2014‎:
Undid good faith revision 610969766 -- reference is inappropriate, just as when describing invention of a new surgical procedure, one does not cite invention of the knife. Proper place for the reference is in article on excimer laser, where it exists.
Eons separate the invention of the knife and of the new surgical procedure, and these two don't sound alike. Mere centuries separate the invention of telescope lens polishing and of a new chemical-mechanical wafer polishing procedure. Nonetheless, the article mentions the antecedent polishing to provide insight, credit to the original inventors, and historical context. Bear in mind that just one decade separates the excimer laser in 1970 from the excimer laser for photolithography c. 1980 (Jain, Willson patent). Moreover, these two are confused easily, because of their similar nomenclature. Lasing excimers may sound alike to readers who are not immersed in the photonics or in the semiconductor photolithography equipment industries. And even to some who are. Therefore IMO and in the opinion of D. Greenwood, one should cite both the new invention and its antecedent.
Secondly, addressing V93a98 in particular, one would not remove good-faith edits (as you did on 31 May and 2 June) without discussing first on the talk page.
Under these circumstances, your deletions might seem inappropriate. 71.128.35.13 (talk) 23:22, 12 June 2014 (UTC)

## Future trends

Future trends omit the most important developments extending Moore's law. These are the real-world technologies which are extending Moore's law today. That's not speculation:
Vertical stacking (Three-dimensional integrated circuit): Already in use in consumer products, like the Samsung SSD 850, which stacks 24 layers.
Many-valued logic: Also on sale in consumer products, and in its 3rd generation! Again, a consumer product on sale is the Samsung 840, which stores 3 bits per cell and is capable of managing 8 binary values at once.
This page had a reference, but it was deleted. That's nonsense, since these technologies are the real-world ones extending Moore's law. — Preceding unsigned comment added by 181.20.137.16 (talk) 22:37, 7 July 2014 (UTC)

Both IBM (2014, http://www.prnewswire.com/news-releases/ibm-announces-3-billion-research-initiative-to-tackle-chip-grand-challenges-for-cloud-and-big-data-systems-266536231.html) and Kahng (2014, http://vlsicad.ucsd.edu/Publications/Conferences/306/c306.pdf) support your observation that new directions are needed. IBM (2014) is expanding research beyond CMOS scaling. Kahng (2014) notes that "the 'heartbeat of the roadmap' – Mx pitch scaling – is slowing due to many reasons that are not directly related to patterning. These reasons, spanning material properties, variability and design margins, electrical performance, and design tool limitations, have reduced the design benefits of recent technology nodes". Kahng (2014) goes on to say: realities of cost and risk force aggressive exploration of 3D scaling (for NAND flash and multi-die integration) and heterogeneous integration (More Than Moore, and beyond-CMOS) paths for future semiconductor products. Lithography and patterning have been joined by 3D scaling, deposition, etch, planarization, next-generation interconnect materials, etc.
as first-class enablers of the continuation of Moore's Law. 71.128.35.13 (talk) 22:41, 10 July 2014 (UTC)

## Picture caption question

What year/version smartphone does the picture caption refer to? I am referring to the current version of this article as of this comment; there is a picture of an old computer, and the caption under the picture reads: "An Osborne Executive portable computer... weighs 100 times as much, has nearly 500 times ... 1/100th the clock frequency of the smartphone." I don't know what "the smartphone" is, since each year they keep getting better and better (... which is kind of the point of this article, no?...) More generally, should this caption be qualified and referenced? I would suggest replacing "the smartphone." with something like "the Google Nexus 5 smartphone released in 2013 [ref].". 128.102.242.117 (talk) 19:55, 11 June 2014 (UTC)
It clearly states in the caption that it is a '2007 Apple iPhone with a 412MHz ARM11 CPU' -- Wgsimon (talk) 21:34, 11 June 2014 (UTC)
The caption states the Osborne cost ten times as much as the iPhone after adjusting for inflation. The Osborne listed for $2,495.00 in 1982, which is equivalent to $6,100 in 2014. Did the iPhone actually cost $600 in 2007? I don't have trouble remembering prices of computers in the early 80s but I'm not sure the iPhone cost that much. If I'm wrong, I apologize. 209.179.21.14 (talk) 03:10, 16 July 2014 (UTC)

From the iPhone article:
The two initial models, a 4 GB model priced at US$499 and an 8 GB model at US$599, went on sale in the United States on June 29, 2007...'
According to www.measuringworth.com the inflation-adjusted price of the Osborne is $5,360.00 in 2007 dollars. This is 10.7 or 8.95 times the price of the phone. -- Wgsimon (talk) 17:19, 16 July 2014 (UTC)
Oh boy, I'm always so busy calculating inflation to today that I completely spaced on doing it for 2007. D'uh!
I also forgot to make a couple of other points. Comparing an Osborne to an iPhone has two problems. One is that iPhones come with Apple's inflated price tag, which skews the price value compared to a modern-day device. Also, comparing the iPhone to an Osborne is like comparing apples to oranges (sorry about that). The Osborne includes components the iPhone doesn't have, which inflates its costs and skews the cost value relative to the iPhone. I wouldn't even compare an old computer to a phone, since part of the phone's cost is the service plan. (I realize the compactness of the phone is part of the reason to use it.) It would be better to compare two computers that are more comparable to each other, which I admit isn't easy, given all of the technological changes over the years. 209.179.21.14 (talk) 01:38, 20 July 2014 (UTC)
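The price comparison in this thread reduces to two divisions; a quick sketch using only the figures quoted above (the 2007-dollar Osborne price is the measuringworth.com figure Wgsimon cited):

```python
# Osborne Executive vs. original iPhone, all in 2007 dollars (figures from the thread above).
osborne_2007_dollars = 5360.00  # $2,495.00 (1982) adjusted per www.measuringworth.com
iphone_4gb = 499.00
iphone_8gb = 599.00

print(round(osborne_2007_dollars / iphone_4gb, 1))  # -> 10.7
print(round(osborne_2007_dollars / iphone_8gb, 2))  # -> 8.95
```

This confirms Wgsimon's 10.7x / 8.95x ratios; the disagreement upthread came only from adjusting to 2014 dollars instead of 2007 dollars.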

## I find it hard to believe this law still works.

For a few years now I've noticed a distinct slowdown. In the 90s, early 00s, and certainly the 80s, I distinctly remember computers almost doubling in power in about a year; now it appears to take around 4-5 years. And no, I don't think a couple more cores on an x86 actually double its power, and we've even stopped doing that.

And this is not unsourced. There are sources explicitly calling for a noticeable slowdown. e.g. I just watched a Michio Kaku video with him explicitly calling it. I'm just saying this article sounds too favorable to the law compared to the obvious signs. --fs 03:52, 13 December 2012 (UTC)
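The doubling time implied by observations like the one above can be back-computed from any two benchmark points; a small sketch with made-up illustrative numbers (not from any source):

```python
import math

# Years per doubling implied by a measured performance ratio over an elapsed period.
def implied_doubling_time(perf_ratio, years_elapsed):
    return years_elapsed * math.log(2) / math.log(perf_ratio)

print(implied_doubling_time(2.0, 5.0))   # power merely doubled in 5 years -> 5.0
print(implied_doubling_time(10.0, 5.0))  # ~10x in 5 years (the remembered 90s pace) -> ~1.5
```

Framed this way, the claimed shift is from a roughly 1.5-year doubling time to a 5-year one, which is the kind of slowdown the quality-adjusted price indexes discussed below would either confirm or refute.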

Moore's Law is about the doubling of transistors, not performance. Tom.gangemi (talk) 15:45, 31 July 2013 (UTC)
We need sources that say this. ---- CharlesGillingham (talk) 04:39, 27 January 2014 (UTC)
The slow pace of the quality adjusted IT equipment price index since 2010 supports your observation. 71.128.35.13 (talk) 03:28, 21 May 2014 (UTC)
First off, it's not a law in any scientific (natural) sense. Second, its definition is not the true reason it holds. The true reason lies in costs, efficiency, and yield. Developing process technology isn't cheap, and yields vary wildly. These limit the choices for the next step one wants to take. The simplest minimum improvement (or smallest improvement factor) that is acceptably satisfying is two; hence, this is the choice chosen all the time. That's why the doubling is always true but not necessarily the time in between. Mightyname (talk) 12:24, 25 August 2014 (UTC)
Just for the fun of it, a "truer" "law" would be:
${\displaystyle P_{\text{(n)}}={\frac {P_{\text{(n-1)}}}{\sqrt {2}}}}$
where P is a manufacturing process and n the process generation. Compare it with historic records in Semiconductor device fabrication. Mightyname (talk) 13:15, 25 August 2014 (UTC)
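Iterating Mightyname's formula from an arbitrary starting node gives a concrete sequence; a quick sketch (the 1000 nm starting point is chosen purely for illustration):

```python
import math

# P(n) = P(n-1) / sqrt(2): each generation shrinks the linear feature size
# by ~1.414x, which halves the area per device.
p = 1000.0  # illustrative starting process, in nm
for generation in range(1, 7):
    p /= math.sqrt(2)
    print(f"generation {generation}: {p:.0f} nm")
```

The printed values (707, 500, 354, 250, 177, 125 nm) sit close to the historic 500/350/250/180/130 nm node cadence, which is roughly what the comparison with Semiconductor device fabrication suggests.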

## "Moore's law" for ARM faster?

With the Apple A8 at 2 billion transistors, I saw that CPU speed is 50x the original iPhone (note, not since the original ARM). Isn't the speed growing faster than for, say, Intel (I know the "law" is about transistors)? Even counting transistors, desktop Haswell is about the same (2 billion?), and if ARM used fewer than Intel in the past and the same or similar now (or a relatively higher fraction now), then it also applies for transistors. comp.arch (talk) 11:39, 16 September 2014 (UTC)

## "predicted that growth will slow at the end of 2013"

I tagged this with "when", and I'm sorry if that's wrong, but I don't know how else to draw attention to the fact that it seems wrong to use a sentence wording with "will" at this point of the sentence. Shouldn't it be a quote instead? Writing "would" seems to go against the guidelines of not using time-sensitive language. — Preceding unsigned comment added by 195.249.185.2 (talk) 19:23, 23 September 2014 (UTC)