
Talk:Moore's law

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 199.125.109.107 (talk) at 06:27, 30 May 2008 (→‎Survey). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Moore's law has been listed as one of the Engineering and technology good articles under the good article criteria. If you can improve it further, please do so. If it no longer meets these criteria, you can reassess it.
Article milestones
Date | Process | Result
June 26, 2006 | Good article nominee | Listed
This article is within the scope of WikiProject Computer science, a collaborative effort to improve the coverage of Computer science related articles on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as GA-class on Wikipedia's content assessment scale.
This article has been rated as Top-importance on the project's importance scale.

Opening comments

Has anyone actually heard or read this stated by Moore firsthand? If yes, would you provide the reference, please?

I've read that Moore vigorously objected to the attempts to attribute this nonscience nonsense "law" to him. Unfortunately I have no reference either (otherwise I'd have intervened in the article). Does anyone remember anything about this? I mean documented, rather than anecdotal.

mikkalai 21 Nov 2003

Thanks to Wernher for providing the reference. After reading the paper, I strongly suspect that nearly nobody who wrote about Moore's law bothered to read the original paper. Besides the "law", it is interesting in itself. As for the "law", I can see now why Moore could object to assuming the authorship: the degree of twistedness of various popular formulations of Moore's law doubles every 18 months :-). Even Wikipedia's previous text was pretty much senseless, if you think about it carefully for a minute: "the number of components on lowest-cost semiconductor chips doubles roughly every 12 months". No wonder G. Moore was pissed off.
mikkalai 21 Nov 2003
http://news.bbc.co.uk/1/hi/sci/tech/4449711.stm is a BBC news article from April 2005, complete with some words from an interview they conducted with him about the 40th anniversary of the original article. He seems to be backing it now, and the original magazine article authored by him is quite real (since Intel just paid 10,000 for an original copy of it). --Wingsandsword 23:56, 11 May 2005 (UTC)


Opening comments, another issue

Comment about using the analogy in the opening paragraph. "... If Moore's Law were applicable to the airline industry, a flight from New York to Paris in 1978 that cost $900 and took seven hours would now cost about $0.01 and take less than one second." This sentence is nonsensical. The comparison is baseless. The two subjects have no foundation for this type of comparison and are completely different. Worse, this comparison provides absolutely no information to the reader and gives no insight into Moore's law. Note that applying Moore's law to human evolution and the number of digits on our right hand: in 1978 human babies had 5; in 2006 human babies would have 81,920 fingers. Or perhaps compare this to the speed of light in 1978 and the speed of light now. These are almost as useless as the original comparison. A better analogy would apply this to something that is actually increasing or decreasing each year.

Also, the term 'now' should probably be replaced by some date, unless you want to keep updating this every day.

I removed this sentence. 88.73.211.151 13:31, 20 September 2006 (UTC)[reply]
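Incidentally, the arithmetic behind such extrapolations is plain compound doubling, so any quantity can be "projected" this way. A minimal sketch (the function name and values are illustrative, not from the article):

```python
def moores_extrapolation(initial, years, doubling_years=2.0):
    """Project a quantity forward, assuming it doubles every `doubling_years`."""
    return initial * 2 ** (years / doubling_years)

# The fingers example above: 5 fingers in 1978, doubling every
# 2 years, projected 28 years forward to 2006.
print(moores_extrapolation(5, 28))  # 81920.0
```

This also shows why the choice of doubling period (12, 18, or 24 months) matters so much: over decades, the projected values differ by orders of magnitude.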

There are images and lots of info about Moore's law here

66.167.137.70 08:05, 13 Apr 2005 (UTC): Intel has a presskit about the law's 40th anniversary. It offers photos and other bits which would be interesting to incorporate into Moore's law:

This image in particular seems like a nice addition:

Request for picture

I put up a request for picture of transistor counts plotted against the doubling rate. Cburnett 16:15, Apr 13, 2005 (UTC)

Is this kind of thing of use? I'm not sure how to label it. If you were thinking of something else please let me know. -- Wgsimon 04:08, 19 Apr 2005 (UTC)
Good enough for me......and added. Thanks! Cburnett 04:18, Apr 19, 2005 (UTC)

Could the intercepts of the trend lines be changed slightly? Is there any reason why the lines must both go through that one point (4004) and miss every other point? Moore made this statement in 1965, but the 4004 point is 1971. Maybe do a least-squares fit of the lines to all the data (adjusting the intercept but maintaining the two slopes), or even just the early portion of the data. --kris 20:33, 2 August 2005 (UTC)

That is a good idea. I will update the image. - Wgsimon 20:46, 2 August 2005 (UTC)[reply]
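A fixed-slope least-squares fit like the one proposed is simple: with the slope held constant, the best-fit intercept in log space is just the mean residual. A rough sketch with made-up data points (the real chart data would be substituted):

```python
import math

# Hypothetical (year, transistor count) points standing in for the chart data.
data = [(1971, 2_300), (1974, 6_000), (1979, 29_000), (1985, 275_000), (1993, 3_100_000)]

# Hold the slope fixed (0.5 doublings per year = doubling every 2 years)
# and least-squares fit only the intercept: the optimal intercept is the
# mean of the residuals in log2 space.
slope = 0.5
residuals = [math.log2(count) - slope * year for year, count in data]
intercept = sum(residuals) / len(residuals)

fitted = [2 ** (slope * year + intercept) for year, _ in data]
```

With the intercept chosen this way, the fit errors in log2 space sum to zero, so the line passes through the "middle" of the points instead of being pinned to one of them.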

It would be nice to have a diagram showing the future projection of Moore's Law, incorporating any theoretical maximums. Rls 15:22, August 21, 2005 (UTC)

There's probably no theoretical maximum on how many calculations per second (well, apart from something absurd like needing all the atoms in the universe or something), only limits on the size of the internal transistors, and who knows, an entirely new method of doing operations may be developed. Rob.derosa 00:00, 1 September 2006 (UTC)

I think it's about the time to update the picture to 2007. --Andrea Censi 08:02, 6 June 2007 (UTC)[reply]

More positive formulation

Quite some time ago I expanded on the criticisms that the "so-called law" was being mythologised, based on various critiques in the internet references. The new charts provide good evidence that there is indeed a genuine phenomenon operating, not just a folk story. I have accordingly removed what appear to be poorly founded criticisms. 194.202.76.66 09:42, 12 July 2005 (UTC)

Quantum computers

I heard that the development of [[Quantum computer|quantum computers]] so far also seems to follow Moore's law roughly; this may be relevant in the complexity discussion. Unfortunately I don't have any references. Erik Sandberg

100 GHz?

" ... could put 100 GHz personal computers in every home ... ": Is this true? 100 GHz means that one clock cycle lasts seconds, and during that time, light can travel only about 3 mm. I think this means that a chip with high frequency and many transistors needs to include a very large number of primitive parallel CPUs; this leads to limitations that may be worth mentioning. Erik Sandberg

I think you don't fully understand all the implications of what you are saying. People keep finding ways to make smaller transistors, and ways to pack them closer together on a chip. Because photons require less time to travel across new versions of a chip that have been compacted to a smaller size, the new, smaller chip runs faster.

So if I can pack an entire room full of transistors (say, the ENIAC or the PDP-1) into a space much smaller than 3mm, it will run at 100 GHz.

Some people [1] claim that there is a huge demand for devices with a die size of less than 1 mm square.

I fail to see why such a device needs to include a very large number of primitive parallel CPUs, although that certainly seems to be a good idea.

Certainly all the limitations you can think of should go in the main article -- "transistors can't be made less than 1 atom wide", etc.

--DavidCary 05:59, 10 November 2005 (UTC)[reply]

This is all wrong. Clock frequency and chip size are independent. Mirror Vax 02:41, 24 December 2005 (UTC)[reply]



There is a limit on the clock frequency. Electronic theory says that switching a transistor dissipates a certain amount of power (P) proportional to current and voltage (I*V). The current is the one needed for charging or discharging the gate of the transistor with a certain threshold charge Q, or in formulas I = dQ/dt, with t the time. The gate acts as a capacitor, and thus electronic theory says that Q = C*V with C a constant. Electronic theory also states that the derivative of the voltage with respect to time equals 2*pi * the clock frequency * V, or in formula notation: dV/dt = 2*pi*f*V, with f the clock frequency. I summarise the equations:
  1. P = I*V
  2. I = dQ/dt
  3. Q = C*V
  4. dV/dt = 2*pi*f*V
Combining equations 2 and 3, with C a constant, results in I = C*dV/dt. Combining this with equation 4 results in I = C'*f*V with C' = 2*pi*C. Combining this with equation 1 results in P = C'*f*V*V. This implies that if you double the clock frequency, you double the needed power.
While the size of transistors scales down exponentially, the threshold charge Q does not scale at the same rate. Therefore (nowadays) the power consumption of chips is the limiting factor on clock speed, and the clock speed no longer progresses according to Moore's law.
As transistor scaling continues, it is possible to add multiple processor cores to make parallel calculations. However, as there is a limit on power consumption, and the pace of the scaling is faster than the improvement in power efficiency, we will instead see multiple cores with a declining clock speed. The overall performance of processors no longer follows the progress in scaling; it is lagging behind.
--Toon Macharis 03:24, 9 August 2006 (UTC)[reply]
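As a numerical check of the derivation above, a short sketch (the capacitance and voltage values are invented for illustration):

```python
import math

def switching_power(capacitance, frequency, voltage):
    """Dynamic switching power P = C' * f * V^2, with C' = 2*pi*C as derived above."""
    return 2 * math.pi * capacitance * frequency * voltage ** 2

# Hypothetical gate capacitance (1 fF) and supply voltage (1.2 V).
p_2ghz = switching_power(1e-15, 2e9, 1.2)
p_4ghz = switching_power(1e-15, 4e9, 1.2)
print(p_4ghz / p_2ghz)  # doubling f doubles P: ratio is 2.0
```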

What Erik Sandberg wrote is quite reasonable. In fact my intention in visiting this discussion page was to add the limiting factor of the speed of electricity. First, the speed of electricity is lower than light speed. And there is also a miniaturization limit: a transistor can't be smaller than an atom. Another important factor is memory. The power of computation depends on memory access speed. Maybe someday we can make a terahertz counter logic, but counting numbers is not a general-purpose computation. The CPU has to access memory, and memory takes a big chip area, resulting in slow access time. Thus, the bigger the memory to access, the slower the access time, and the slower the speed. The complex formulae above might all be true, and I agree the first limit is power consumption today. But even if we supercool the chip and provide a huge amount of power, we can't pass a limit. It is an "eternal computation limit". 1000 years later this limit will still be there; there is no hope of passing it with this architecture. —Preceding unsigned comment added by 85.108.36.24 (talk) 21:21, 4 September 2007 (UTC) User(85.108.36.24): Cenk Tengerli.

Sematech

Would someone be able to create a Wikipedia entry for Sematech? -- Corvus 15:34, 26 July 2005 (UTC)[reply]

Meaning?

The quote says

The complexity for minimum component costs has increased...

What does "for minimum component costs" mean? Does he look at the chip whose cost per transistor is minimal, and counts the number of transistors on the chip? Or does he look at the chip whose total cost is minimal, and counts the number of transistors on the chip? Or does he count the number of transistors per square millimetre rather than per chip? So basically I have four variants of the law, and the only one that I believe has any hope of being true is

  • Take the chip whose cost per transistor is minimal, then count the number of transistors per square millimetre.

Is that the intended interpretation? Thanks, AxelBoldt 19:13, 26 October 2005 (UTC)[reply]

Moore's Law Debunked

At the head of the article we've got a very scientific looking graph that says transistor count doubles every 24 months, and as 194.202.76.66 noted above, scientific-looking graphs are the sine qua non of scientific proof. Meanwhile, the "most common" formulation of the law is that it doubles every 18 months, and the original formulation is that it doubles every 12 months. Despite this, a whole bunch of credulous, unquestioning fools contributing to this article have managed to maintain the tone and the idea that Moore's law is a true fact as opposed to a disproved urban legend at best and, less diplomatically, a lying piece of crap. User:Anon 08:22, 8 November 2005 (UTC)

The figures I used to produce the graph were from here. I would be very interested in adding some more processors to the graph. Does anyone have any good sources? -- Wgsimon 16:02, 9 November 2005 (UTC)[reply]

Personally, I don't have any problems with your graph. But personally, I think that your graph completely disproves Moore's law, even the 24-month version. I note that the Pentium III is way, way off the curve and that it has barely more transistors than the Pentium II. I note that the Itanium actually has fewer transistors than the Pentium IV. And I note that with the later CPUs, you're including cache RAM, which gratuitously boosts the transistor count. 24.200.176.92 04:08, 13 November 2005 (UTC)

Of course, I hadn't counted on your graph showing a doubling every 20 months. The accepted rate is closer to 26 months. 24.200.176.92 04:50, 13 November 2005 (UTC)[reply]
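The doubling period implied by any two points on such a graph follows directly from the definition; a small sketch (the counts below are made up, not taken from the graph):

```python
import math

def doubling_time_months(count_start, count_end, months_elapsed):
    """Months per doubling implied by two transistor counts some months apart."""
    doublings = math.log2(count_end / count_start)
    return months_elapsed / doublings

# A 16x increase over 80 months implies one doubling every 20 months.
print(doubling_time_months(1_000_000, 16_000_000, 80))
```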

NPOV versus scientifically proven facts

Some people seem to think that NPOV means we must support commonly believed misconceptions against the truth, even killing the truth. I hope most people don't believe this.

It is certainly worth noting in the article that the "Law" is nothing more than an observation that has held up in the past, with various caveats, which perhaps could be spelled out better. Moore himself has been caught off guard by the popularity of this so-called law. This can all be explained in the article in an NPOV manner, if proper sources can be found. Without all the information at our disposal that contradicts or qualifies the "Law", you're right that the article is not as objective as it could be.—GraemeMcRaetalk 12:44, 10 November 2005 (UTC)

I note with interest a recent edit by 24.200.176.92, which contains the following POV paragraph:

Between these two extremes lie an entire constellation of interpretations, almost all of them empirically false. The process of picking and choosing a particular interpretation from among all the ones possible and carefully arranging facts in order to support it has no basis in empirical science. Such a process if applied by any serious researcher would be condemned as fabrication of evidence.

The gist of the paragraph seems to be that Moore's law is the result of bad science. It argues strongly against a point that wasn't even made by the original article. Moore's law was never proposed as a law of science on the level of, say, Newton's inverse-square law of gravity. Rather, the law was described in the article's topic sentence (until these recent edits changed it) in a relatively NPOV manner: Moore's law is the empirical observation that ...

Like 24.200.176.92, I don't want an edit war, but the article can't stand as it is. Instead, the text must explain what Moore's law is, and what it is not. It is an empirical observation, popularized by Moore, a co-founder of Intel. It is not science. It characterizes the growth of some measures as having been exponential in the past, when viewed over the course of several years. It doesn't suggest that the growth has been smooth; quite the contrary. Each new innovation in the design of computer hardware causes a "step" in the measure of growth.

The objections raised by 24.200.176.92 deserve due consideration, and should result, in the end, in a more factual and balanced presentation of Moore's law -- not by lambasting purveyors of junk science, but rather by stating clearly what Moore's law really is, and what its limitations are.—GraemeMcRaetalk 15:23, 10 November 2005 (UTC)

It seems to me that the introduction should contain a brief statement of what Moore's Law claims, followed by a brief summary of the issues discussed in the body of the article. If there are opposing points of view then one should not try to reconcile them in the introduction. There is definitely a place for a Criticisms of Moore's Law section or similar. It is important that objections are sourced and are not original research.
The sentence at the top of this section states that,
Some people seem to think that NPOV means we must support commonly believed misconceptions against the truth, even killing the truth. I hope most people don't believe this.
If Moore's Law is a commonly believed misconception, that misconception should still be stated in a clear and unambiguous way. It should be possible to do so without supporting or opposing it. The introduction appears to be aimed at debunking Moore's Law rather than stating what it is.
-- Wgsimon 18:19, 10 November 2005 (UTC)[reply]
It isn't my contention that Moore's law is the result of bad science. Moore's law is the result of no science and a massive self-propaganda campaign finding very fertile ground among a bunch of wishful-thinking types. Now, the "evidence" for Moore's law is the result of bad science but evidence doesn't really figure in the popular education and dissemination of Moore's law. What figures into it is bald, stone cold, mindless repetition. Moore's law is more a subject for social science than economics.
As for the contention that I was attacking the belief that Moore's law was a natural physical law ... oh come on! Could you have come up with a more credible red herring? First of all, Moore's law was originally proposed as an economic law and that's exactly how I treated it. Now, economic laws aren't inviolable but they're still supposed to be true as approximations. Moore's law isn't even true as an approximation (i.e., if you assume a constant exponential), only as a generality.
And secondly, it seems to me that people are quick to forget that "empirical" without any qualifiers, is generally interpreted as meaning "true". Especially so when used in conjunction with "observation" and other scientific-sounding terms. So "the empirical observation that blah blah" tends to mean, to most people, that blah blah is a true fact. The category of "false empirical observations" doesn't really register in people's minds unless it's explicitly pointed out with words such as false, myth, delusion, illusion, disproved, unproved, et cetera. And then there are the really neutral ways of saying it like hypothesis, theory, conjecture, claim, statement, et cetera. Pick your poison, I'll take any one.
You know, these are really very standard words in science with very narrow, highly-codified meanings. And empirical -> true fact of reality. Oh yeah, and the codified meaning of "law" is a concrete statement (such as "exactly 90% of everything is crap"). The correct code word for a generality (such as Moore's law is) is 'principle'. The Moore principle wouldn't have implied that there is a rigorous interpretation of it, whereas the very name "Moore's law" implies that there is. Since this is factually wrong, Moore's law is simply false. Well, technically it's "merely" an oxymoron but being an oxymoron is a really bad thing to be for something that claims to be a Law.
Finally, I don't think it's too much to ask of an introduction that it include the single word false in there somewhere. I mean, you've got a whole phrase about who it's attributed to. What's a single word, on an infinitely more important aspect than the mistaken origins of Moore's law, compared to an entire phrase? 24.200.176.92 03:59, 13 November 2005 (UTC)[reply]

First of all, the term "Moore's law" is widely used and it is not for us to change it. The common usage of the term "law" extends beyond a proven scientific principle, e.g. Murphy's law. The real problem here seems to be coming up with a valid single benchmark for measuring progress in computing. This parallels other problems in econometrics, such as measuring poverty or inflation. Often arbitrary decisions have to be made to come up with a number that can be tracked; however, no one seems to have done this for the computer industry. And arbitrary decisions can cause difficulties over long time periods. The shift to onboard cache leads to higher transistor counts; on the other hand, the shift to hardwired instructions instead of microcode may have gone the other way. What processor type to use as a benchmark, and when in its life cycle, is another problem. And the goal itself keeps shifting. Apple's switch to Intel was justified on Apple's realization that MIPS per watt is a more important consideration than MIPS per chip. I therefore think it is misleading to say Moore's law is false. It was never intended as an exact prediction, but as a rough approximation, and in that sense it seems to have held up pretty well. --agr 20:50, 14 November 2005 (UTC)

Okay, "first of all" Moore's law is a bunch of crap. And the "real problem" is that Moore's law is a bunch of crap. Now, I don't know about you but I don't think that a wikipedia article is an appropriate place for you to put up your personal opinions and your own original research. Because that's exactly what you're advocating when you say that "the real problem here seems to be coming up with a valid single benchmark for measuring progress in computing" as if it's anyone here's job to do so. It isn't, it's our job to point out that nobody else has ever managed to do so. Everything else you say is so much meaningless droning blah blah blah. And "it is misleading to say Moore's law is false" WHAT THE FUCK? And "it was never intended as an exact prediction, but as a ...." Someone should spank you for blatantly contradicting the facts as we both know them. 24.200.176.92 17:33, 15 November 2005 (UTC)[reply]

Moore's law for IQ when people will be genetically engineered

When we find the genes that affect intelligence, it is possible that we will start genetically engineering ourselves to be more intelligent. It is reasonable to expect that these new, more intelligent people will be able to improve the intelligence of their descendants even further, simply because they will be more intelligent. That closed loop of self-reengineering and self-improvement is likely to be exponential because the two factors improve each other: just as computing power makes designing more computing power faster, brain power would make improving brain power faster.


^ What the hell is this hogwash? Why is it here? Darren

What do you find to be hogwash about it? If genes have something to do with intelligence (which seems a plausible conjecture, although by no means entirely proven, given that human genes differ from chimpanzee genes, and the two species also differ in average intelligence), if such genes are susceptible to genetic engineering, then it seems quite reasonable to wonder if a feedback loop will occur, in which scientists make themselves increasingly smarter, enabling the augmented scientists to invent ways to make themselves smarter still. This seems directly analogous to a process which helps drive Moore's Law: engineers using the current generation of computers to design the next generation of still-faster chips. The whole mental phenomenon of invention has been only vaguely understood. Throughout history, humans have depended on the random appearance of rare people who somehow happen to have inventive skill (and not all cultures have been equally hospitable to inventive types). This is similar to the way hunters depend on the happenstance appearance of wild game, in contrast to farmers, who alter the environment to make it more productive for human use. If human intelligence can be engineered, it might trigger a cultural change comparable to the switch from hunting to agriculture. Anyway, the relation to Moore's Law is that genetic engineering technology relies heavily on computers; as computers get better, they drive progress in fields that depend on computers. --Teratornis 01:38, 10 February 2007 (UTC)[reply]

Phenomenology

"Moore's Law" is Phenomenology. This is a scientific word meaning "I drew a curve through the points by eye" :-). No baggage about "validity", no claims other than "you can see that the curve goes more or less near the points". Then, as a separate second act, there is Extrapolation. This is widely recognized as involving Implicit Assumptions, and if you want even minimal credibility, you are obliged to make the assumptions explicit; and because the assumptions are not data, what you are doing at this point is Speculation not Science. Of course, in a country where science is a spectator sport, all of this is easily overlooked.

Why does this article get the basic facts wrong in its opening statement?

Moore's "Law" is not about computing power. It's about the number of features (e.g., transistors) that can be placed on an area of silicon.

What's the easiest way to permanently correct this article?

Press the edit button and then get going with your keyboard. If you make mistakes others will correct you, SqueakBox 17:34, 27 December 2005 (UTC)[reply]
The error was introduced in November 2005 by User:24.200.176.92. There is no way to permanently correct the article. Mirror Vax 17:46, 27 December 2005 (UTC)

Well except by reediting it afresh, SqueakBox 17:48, 27 December 2005 (UTC)[reply]

Or we could revert it to [2]. This is what the anon did [3], SqueakBox 17:52, 27 December 2005 (UTC)[reply]

Improving image

The chart is nice and all, but it's very Intel-specific. Maybe we could put some other processors on there too, like AMD? Last I checked AMD had more of the desktop CPU market than Intel did! -- 04:25, 5 March 2006 (UTC)

That would be good. If anyone has a link to information about other processors, please let me know. -- Wgsimon 19:13, 12 March 2006 (UTC)[reply]
How about this page right here?
Excellent -- Wgsimon 15:06, 15 June 2006 (UTC)[reply]

I actually wandered over to the talk page on a similar issue. I'm no computer expert, but AMD has 65 nm chips this year too, right? I feel like that's worth mentioning. I feel that the article is generally Intel-biased (although not enough to warrant a NPOV tag or anything), and I might come back to it later if I can do enough research. --ZachPruckowski 06:15, 9 April 2006 (UTC)[reply]

There's a reason the "law" does not mention non-intel chips. The law is purely a marketing tool used by - you've guessed it - Intel! To try to add other processors to it, not only appears to give the thing some objective justification it does not deserve, but also constitutes original research. --Necessaryx 15:30, 9 July 2006 (UTC)[reply]
"Moore's Law is the 1965 prediction by Gordon Moore (co-founder of Intel) that the transistor density of semiconductor chips would double roughly every 18 months." (markup removed)
Actually, this is similar to other definitions I've seen in that it doesn't mention any chip companies. Why then do we have to restrict the article's analysis to Intel chips? Brian Jason Drake 10:23, 18 July 2006 (UTC)[reply]

Well, I dunno if this is out of place, but the image of Moore's law seems a bit outdated. (No, not the note he wrote by hand but the other, of course. ;)) I mean, this graph shows Intel processors up till 2004, but at the time of this post it's 2007 and a cr*p load of new Intel chips are already on the market. (Dual-core, quad-core even...) So isn't that noteworthy? Same goes for the image about the HDD capacities. 14:09 (UTC) 29 March 2007

"As of Q1 2006..."

Since I am an Australian, when I saw that I initially thought July 2006, but that seems a bit far into the future... Brian Jason Drake

Kryder's Law

I edited the paragraph on Kryder's Law to reflect the discussion going on there (namely, it is not better but equal to Moore's, and it hasn't been doing too well lately). David Souther

Get ready to edit it again. Kryder's Law is on AfD, and undergoing heavy edits. How would you feel about a merge of Kryder's Law into a section of this article dealing with laws affecting other computing parts? Maybe move the Wirth's Law reference into that section too, as well as the few sentences about other parts in the same section Kryder's Law is in now. --ZachPruckowski 08:42, 9 April 2006 (UTC)

18 or 24 months?

Moore's law is not ambiguous at all. It was stated that the number of components on microprocessors doubles every two years, not 18 months. There is confusion with the doubling of performance, which is indeed supposed to double every 18 months (but really does every 20 months or so).

See this transcript of a conversation with Gordon Moore, more specifically paragraph 3:

ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf

QcRef87 19:12, 2 May 2006 (UTC)[reply]

Yep, that makes sense. Everyone who speaks of 18 months usually also means performance. Be bold: include the source and edit it yourself. Slicky 06:18, 17 September 2006 (UTC)
There's a source already there, but people that don't bother to read change the number to 18 on occasion. Fixed again. — Aluvus t/c 06:28, 17 September 2006 (UTC)[reply]

Why, if this is so clear, does this article use the 18 month number so many times?? 205.157.110.11 23:00, 25 May 2007 (UTC)[reply]

I made some fixes, including a comment in the lede warning people not to change it back to 18 ... richi 21:12, 24 September 2007 (UTC)[reply]
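Since the distinction keeps getting lost, it may help to see how much the two readings diverge. A tiny sketch of the growth factor over a decade under each doubling period:

```python
def growth_factor(years, doubling_months):
    """Total growth after `years` if the quantity doubles every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# Over 10 years: doubling every 24 months gives 32x,
# while doubling every 18 months gives roughly 100x.
print(growth_factor(10, 24))
print(growth_factor(10, 18))
```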

Removal of reference to "Self-Fulfilling Prophecy"

I would like to suggest the removal of Moore's Law being quoted as a self-fulfilling prophecy in the Industry Driver section. It clearly cannot be considered an sfp due to the fact that it is an ongoing condition, one in which it may today hold true but tomorrow become false. A prophecy by its very nature is absolute and not something which is constantly ongoing. Enigmatical 04:20, 9 May 2006 (UTC)[reply]

Engelbart reference

This sentence: However, Moore had heard Douglas Engelbart's similar observation, possibly in 1965. and the next sentence have been in most versions of the article since April 17, 2005. The "similar observation" was given a date of "possibly 1960" by its original editor, which was evidently changed to 1965 to conform better with the fact that ICs were not invented until 1959. What source can be given to substantiate this entry? --Blainster 22:47, 10 June 2006 (UTC)

OK, did the search myself. The Engelbart entry was prompted by this NYTimes article of April 17, 2005. It is interesting, but it merely says that Engelbart gave some talks in 1960 that expressed the conviction that ICs would become smaller. He suggested no development rate, which is what Moore's law is about. I think the Engelbart entry should be removed from the intro, and if kept at all, be moved to the history section. --Blainster 23:01, 10 June 2006 (UTC)[reply]

the more things change...


the more they remain the same. Trying to build complexity in a system is not worth it.

To get to A-Class, this may be helpful

The following suggestions were generated by a semi-automatic javascript program, and may or may not be accurate for the article in question.

  • Please expand the lead to conform with guidelines at WP:LEAD. The article should have an appropriate number of paragraphs as is shown on WP:LEAD, and should adequately summarize the article.
  • Per WP:CONTEXT and WP:MOSDATE, months and days of the week generally should not be linked. Years, decades, and centuries can be linked if they provide context for the article.
  • There may be an applicable infobox for this article. For example, see Template:Infobox Biography, Template:Infobox School, or Template:Infobox City. (Note that there might not be an applicable infobox; remember that these suggestions are not generated manually)
  • Per WP:MOSNUM, there should be a non-breaking space (&nbsp;) between a number and the unit of measurement. For example, instead of 18mm, use 18 mm, which when you are editing the page, should look like: 18&nbsp;mm.
  • Per WP:MOSNUM, when doing conversions, please use standard abbreviations: for example, miles -> mi, kilometers squared -> km2, and pounds -> lb.
  • Per WP:MOSNUM, please spell out source units of measurements in text; for example, "the Moon is 380,000 kilometres (240,000 mi) from Earth."
  • Per WP:MOS#Headings, headings generally do not start with the word "The". For example, ==The Biography== would be changed to ==Biography==.
  • Please alphabetize the categories and/or interlanguage links.
  • There are a few occurrences of weasel words in this article- please observe WP:AWT. Certain phrases should specify exactly who supports, considers, believes, etc., such a view. For example,
    • it has been
    • might be weasel words, and should be provided with proper citations (if they already do, or are not weasel terms, please strike this comment).
  • As is done in WP:FOOTNOTE, for footnotes, the footnote should be located right after the punctuation mark, such that there is no space in between. For example, change blah blah [2]. to blah blah.[2]
  • Please ensure that the article has gone through a thorough copyediting so that it exemplifies some of Wikipedia's best work. See also User:Tony1/How to satisfy Criterion 2a.

You may wish to browse through User:AndyZ/Suggestions (and the javascript checklist; see the last paragraph in the lead) for further ideas. Thanks, Wim van Dorst (Talk) 22:14, 26 June 2006 (UTC)[reply]

Moore's Second Law

I've just read about the less known "Moore's Second Law". [4] [5] I don't know whether this Second Law deserves a new article or should be explained in this one. Any thoughts? --surueña 18:57, 31 August 2006 (UTC)[reply]

YHz?????

I'm just starting out programming C++, and I wrote a console application about Moore's Law that produced this text:

  • Year: 2006 GHz:2
  • Year: 2008 GHz:4
  • Year: 2010 GHz:8
  • Year: 2012 GHz:16
  • .....
  • Year: 2032 THz:16.384
  • Year: 2034 THz:32.768
  • Year: 2036 THz:65.536
  • Year: 2038 THz:131.072
  • .....
  • Year: 2114 YHz:36028.8
  • Year: 2116 YHz:72057.6
  • Year: 2118 YHz:144115
  • Year: 2120 YHz:288230
  • Year: 2122 YHz:576461

Now, according to what I've read on the Talk Page, this is 1000%(sic) not possible. How could this be considered a law?
(this is based on transistor counts per processor doubling every 2 years).
(I'm not sure that 2 GHz for an average computer is true in 2006, but...) —The preceding unsigned comment was added by 24.49.75.47 (talk) 14:46, 27 December 2006 (UTC).[reply]
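For what it's worth, the arithmetic behind the table above can be sketched in a few lines of Python. The 2 GHz baseline in 2006 and the doubling-every-2-years rule are the poster's assumptions, not anything Moore claimed (as the reply below notes, Moore's law concerns transistor counts, not clock speed):

```python
# Poster's assumption: 2 GHz in 2006, doubling every 2 years.
# (Moore's law is about transistor counts, not clock speed.)
def clock_ghz(year, base_year=2006, base_ghz=2, period=2):
    return base_ghz * 2 ** ((year - base_year) // period)

# SI prefixes above giga, for display.
PREFIXES = [("GHz", 1), ("THz", 1e3), ("PHz", 1e6),
            ("EHz", 1e9), ("ZHz", 1e12), ("YHz", 1e15)]

def pretty(ghz):
    label, scale = PREFIXES[0]
    for unit, s in PREFIXES:
        if ghz >= s:
            label, scale = unit, s
    return f"{ghz / scale:g} {label}"

for year in range(2006, 2124, 2):
    print(year, pretty(clock_ghz(year)))
```

Prefixed this way, the 2114 entry comes out near 36 YHz rather than 36,028.8 YHz; the table above appears to skip the zetta step, so its "YHz" rows are actually ZHz.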

  • Again, Moore was speaking of the number of switches per processor, not the speed of the processor itself (although the two can be related). Assuming 50 transistors per square inch in 1965 (http://cactus.eas.asu.edu/Partha/Columns/09-03-Moore.htm), here's a more debatable chart going out to 2122 (badly assuming, of course, that binary switching will still be a relevant concept in 100+ years):
  • 1965 50
  • 1967 100
  • 1969 200
  • 1971 400
  • 1973 800
  • 1975 1,600
  • 1977 3,200
  • 1979 6,400
  • 1981 12,800
  • 1983 25,600
  • 1985 51,200
  • 1987 102,400
  • 1989 204,800
  • 1991 409,600
  • 1993 819,200
  • 1995 1,638,400
  • 1997 3,276,800
  • 1999 6,553,600
  • 2001 13,107,200
  • 2003 26,214,400
  • 2005 52,428,800
  • 2007 104,857,600
  • 2009 209,715,200
  • 2011 419,430,400
  • 2013 838,860,800
  • 2015 1,677,721,600
  • 2017 3,355,443,200
  • 2019 6,710,886,400
  • 2021 13,421,772,800
  • 2023 26,843,545,600
  • 2025 53,687,091,200
  • 2027 107,374,182,400
  • 2029 214,748,364,800
  • 2031 429,496,729,600
  • 2033 858,993,459,200
  • 2035 1,717,986,918,400
  • 2037 3,435,973,836,800
  • 2039 6,871,947,673,600
  • 2041 13,743,895,347,200
  • 2043 27,487,790,694,400
  • 2045 54,975,581,388,800
  • 2047 109,951,162,777,600
  • 2049 219,902,325,555,200
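The whole chart above is one formula, 50 × 2^((year − 1965)/2); a minimal Python sketch reproducing it (the 1965 baseline of 50 transistors per square inch is the assumption from the linked column, not a sourced figure):

```python
def transistors_per_sq_in(year, base_year=1965, base_count=50, period=2):
    # Doubling every `period` years from the assumed 1965 baseline.
    return base_count * 2 ** ((year - base_year) // period)

# Reproduce the chart above:
for year in range(1965, 2051, 2):
    print(year, format(transistors_per_sq_in(year), ","))
```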


Somebody might want to update this with information on what actually happened between 1947 and 2007. The year of "1 switch" has got to be off, at the very least. I'm also semi-sure that the "2007" number has to be off by a factor of 10. 147.145.40.43 19:31, 26 April 2007 (UTC)[reply]

Believe it or not, we are actually at 820 million right now (Intel Yorkfield), which is 8 times more than the prediction above. --68.147.51.204 (talk) 09:13, 12 December 2007 (UTC)[reply]

The prediction is transistors per square inch, not transistors, so what is the area of the Yorkfield die, and how many layers? Then you can work out the transistors per square inch. --152.78.202.157 (talk) 11:08, 13 December 2007 (BST)

FPNI statement

I just removed the following statement that was just added: In January 2006 Hewlett-Packard announced a new technique using an architecture called “field programmable nanowire interconnect (FPNI)” that promises to jump three generations forward, in violation of Moore's Law.

I removed it because it was in the wrong section (Early forms, rather than Future trends) and that it was unsourced. The source I can find (here) doesn't support the assertion that Moore's Law has been violated. By the time this experimental technique is used to produce chips, it should fall right into line. -Amatulic 23:54, 16 January 2007 (UTC)[reply]

Telecommunication cost

Might Moore's law#Formulations of Moore's Law say something about telecommunication cost trends? Futurists#Future thinking mentions "the fall of telecom costs towards zero," but provides no link. I saw nothing here, nor in telecommunication about the long-term historical cost trends for telecommunication. Might a graph similar to Kurzweil's show the cost of telecommunication dropping similarly over multiple paradigm shifts, from relay couriers to signal towers to telegraphs, up to fiber optics today? Does anyone know of suitable references? Thank you for your time. --Teratornis 02:18, 10 February 2007 (UTC)[reply]

Exponential power increase with increasing clock speed?

"This occurs because higher clock speeds correspond to exponential increases in temperature, making it possible to have a CPU that is capable of running at 4.1 GHz for only a couple hundred dollars (using practical, yet uncommon methods of cooling), but it is almost impossible to produce a CPU that runs reliably at speeds higher than 4.3 GHz or so."

The above entire paragraph seems to be bullshit.


Contrast with peak oil?

Peak Oil (or Hubbert's Curve) is similar to Moore's Law in that both are empirical observations about long-term resource or technology trends with far-reaching implications for society. However, while Moore's Law seems to violate Murphy's Law, Peak Oil obeys it with a vengeance. The two "laws" are related in that energy and information are to some extent interchangeable factors of production; the cost of information steadily falls due to Moore's Law, while the cost of energy appears likely to steadily increase, at some point, due to Hubbert's Curve, barring some seemingly unlikely breakthrough in energy technology (for the most part, a mature industry, where the pace of innovation is slow). Therefore, the challenge before technologists is to find ways to substitute information for energy, wherever possible. Sooner or later, the solution that substitutes information for (some) energy must become more profitable. For example, executives can meet via videoconferencing instead of flying around in business jets. At some point the overall economics must begin to favor videoconferencing/telecommuting/telepresence over physical travel by information workers, although the exact time when this occurs will obviously differ with the application, and there will be the usual cultural lag.

I'd like to mention the above observation, which seems plainly self-evident to little old POV me, but to avoid violating WP:OR I need to find some citations that discuss the information/energy tradeoff. I read about this sort of thing decades ago in some futurism-type books whose titles I don't recall (even before the Internet the economic shift from energy to information was obvious). There was also a Scientific American article, I think, some years back which discussed the steadily-falling energy input per unit of GDP in industrial economies. I'll do some looking, but if anyone else is aware of such thinking and can mention some references, I'd appreciate it. --Teratornis 02:18, 10 February 2007 (UTC)[reply]

Regarding "Misconception"

Someone edited the page to include a short opinion that Moore's Law should be labeled a theory rather than a "Law". Understandably, their opinion is valid, but Moore's Law is universally known by that title and, further, follows many other maxims with similar common names (e.g. Murphy's Law).

Also it seems like a bit of original research to me.

Anyone care if it's removed? Ar-wiki 12:49, 20 March 2007 (UTC)[reply]

Quote from Kurzweil (about there being 5 generations of exponential growth in computers)

I corrected an error in attributing Bletchley Park's "Heath Robinson" codebreaking machine to Turing (it was Max Newman's work). However - I didn't realise that I had corrected a verbatim quote from Kurzweil's article listed as [11] in the references.

I assume that my "correction" will have to come out of the verbatim quote, but that would leave us with a factual error in the resulting encyclopedia article. What's to do - point out the error within the quote with a "(sic)" note followed by a line or two pointing out the correction? What's the correct procedure here?

How about just using brackets? [Heath Robinson]

By the way, Moore's law breaks down long before 600 years. I give it less than a hundred years.

Removed unsourced digression

I am trying to improve the clarity of this article (especially for less technical readers). I removed this paragraph because it was a needless digression. It is also unsourced and (I'm guessing) somewhat controversial, so it should be removed for WP:VER. ---- CharlesGillingham 12:23, 3 September 2007 (UTC)[reply]

Under the assumption that chip "complexity" is proportional to the number of transistors, regardless of what they do, the law has largely held the test of time to date. However, one could argue that the per-transistor complexity is less in large RAM cache arrays than in execution units. From this perspective, the validity of one formulation of Moore's Law may be more questionable.

Forest and Trees

I've made a number of changes that are designed to make this article clearer and more accessible to the general reader. I have rearranged and re-titled some of the sections, removed a digression and written a new intro. The main idea here is to avoid getting into detailed technical discussions and disputes in the early sections of the article, and to put Moore's Law into its historical context. The article should make sense at first glance to any educated reader.

There is more to do: there is still some repetition and some paragraphs are almost undecipherable to someone who is not familiar with chip manufacture. (See the guideline Wikipedia:Explain jargon). Also, some sections need an editor to make sure the entire section is complete and focused. Does every one of these observations have to be here? Is there anything important being left out? ---- CharlesGillingham 19:01, 3 September 2007 (UTC)[reply]

Force or Result? Cause or Effect?

Am I the only one who has a problem with the last sentence of the introduction?

Moore's Law is a driving force of technological and social change ...

Moore's Law is an empirical observation and as such describes an effect, not a cause. I would like this sentence removed and will do so within a week unless someone has a better suggestion. --Tjconsult 22:48, 20 September 2007 (UTC)[reply]

Fixed. Is this okay with you? (The "driving force of social change" here is actually the "increasing usefulness of digital electronics" and moore's law "describes it" by roughly quantifying the rate of change) ---- CharlesGillingham 01:34, 25 September 2007 (UTC)[reply]
Thanks. --Tjconsult 04:04, 1 October 2007 (UTC)[reply]

Oversupplied or Overpriced?

WP:NPOV: Dark fiber overcapacity and the optical network bandwidth oversupply greatly exceeding demand of even the most optimistic forecasts by a factor of up to 30 in many areas.

I never hear the word "overcapacity" applied to roads, or water. In both those cases, most infrastructure is publicly owned or governed, even if built by private firms. Bandwidth isn't "oversupplied", it's overpriced. Several US municipal agencies have demonstrated that -- by building infrastructure and offering bandwidth priced below private suppliers. Network design & deployment is a notable investment hurdle; and sadly, many of our governance mechanisms are not structured to express "the will of the people" or to strongly act "for the people" in that way. Data pipes have become as fundamental and universally needed as water & sewer pipes became in the past. Similar management and control issues apply, and we should follow the same provision models.

However, this Moore's Law article is not the place to engage that complex issue ... and definitely not the place for language that proselytizes for the theft of the commons. Lonestarnot 06:35, 21 September 2007 (UTC)[reply]

18 or 24 months?

Moore's law is not ambiguous at all. It was stated that the number of components on microprocessors doubles every two years, not 18 months. There is confusion with the doubling of performance, which is indeed supposed to double every 18 months (but really does every 20 months or so).

See this transcript of a conversation with Gordon Moore, more specifically paragraph 3:

ftp://download.intel.com/museum/Moores_Law/Video-Transcripts/Excepts_A_Conversation_with_Gordon_Moore.pdf

QcRef87 19:12, 2 May 2006 (UTC)[reply]

Yep, that makes sense. Those who speak of 18 months are usually talking about performance. Be bold: include the source and edit it yourself. Slicky 06:18, 17 September 2006 (UTC)[reply]
There's a source already there, but people that don't bother to read change the number to 18 on occasion. Fixed again. — Aluvus t/c 06:28, 17 September 2006 (UTC)[reply]

Why, if this is so clear, does this article use the 18 month number so many times?? 205.157.110.11 23:00, 25 May 2007 (UTC)[reply]

I made some fixes, including a comment in the lede warning people not to change it back to 18 ... richi 21:12, 24 September 2007 (UTC)[reply]
I moved this into a footnote. (see next section) ---- CharlesGillingham 01:29, 25 September 2007 (UTC)[reply]
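For anyone weighing how much the 18-vs-24-month distinction actually matters, the divergence is easy to quantify (plain arithmetic, no source needed):

```python
def growth_factor(years, doubling_months):
    # Multiplicative growth after `years`, given a doubling
    # period expressed in months.
    return 2 ** (years * 12 / doubling_months)

# Over a decade: 24-month doubling gives 32x,
# while 18-month doubling gives roughly 101.6x,
# more than three times as much.
```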

Forest & Trees

I think it's important to keep the lead as clear as possible. Someone who arrives here trying to find out what Moore's Law is doesn't need to know messy details unless it's absolutely necessary for accuracy. There is plenty of room to refine the definition and hash out details throughout the rest of the article. The lead should be written primarily for readers who are completely unfamiliar with Moore's law.

(The importance of Moore's Law is much wider than the chip industry. It has the same kind of significance as trends like the green revolution, globalization or urbanization, and so it gets mentioned occasionally in history, sociology and other humanities. The lead should be written primarily to help these readers, who may have no familiarity at all with the law, its history or its importance, and may know next to nothing about digital electronics. The rest of the article is for readers who are interested in the details.) ---- CharlesGillingham 01:29, 25 September 2007 (UTC)[reply]

Updated diagram?

Anyone know the transistor counts for, say, the Core 2 line of chips so we can update that diagram on the top of the page and see if the law has been holding for the past several years? Sloverlord 17:38, 18 October 2007 (UTC)[reply]

Removed spurious reference document

Removed the link: The Impact of Pervasive Symmetries on Hardware and Architecture from the Data reference section. The document is a fake machine-generated paper. —Preceding unsigned comment added by 130.235.34.47 (talk) 03:46, 1 November 2007 (UTC)[reply]

Fair use rationale for Image:GordonMooresOriginalGraphFrom1965.PNG

Image:GordonMooresOriginalGraphFrom1965.PNG is being used on this article. I notice the image page specifies that the image is being used under fair use but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.

Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Wikipedia:Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If there is other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images uploaded after 4 May, 2006, and lacking such an explanation will be deleted one week after they have been uploaded, as described on criteria for speedy deletion. If you have any questions please ask them at the Media copyright questions page. Thank you.

BetacommandBot 06:54, 7 November 2007 (UTC)[reply]

Fixed this (I hope) by using the same templates as Image:Valeri borzov.jpg ---- CharlesGillingham 19:29, 7 November 2007 (UTC)[reply]
I guess I failed. I'm not sure why my template was insufficient. One bot marked the image for deletion, another bot deleted it. I'm not sure if the second bot even noticed that I'd tried to fix this. Does anybody understand how this works well enough to try to get the image back? ---- CharlesGillingham 04:40, 4 December 2007 (UTC)[reply]

LCD screens do not follow Moore's law

The page currently states that LCD screens also follow Moore's law. This is patently incorrect. In 1997, the standard laptop display had a resolution of between 640x480 and 1024x768 pixels. By 2000, it had increased to between 1024x768 and 1920x1440. Since then, it has decreased: the maximum resolution you're likely to find is 1920x1200, and it's very rare to find anything more on the desktop. If Moore's law had applied over the last 10 years, we would now have displays ranging between 3600x2700 and 5600x4200.

The reason this hasn't happened is clear: the web's holding it back. Most web sites don't even render correctly at 1920x1440, and the tendency seems to be to more breakage. HTML is holding back progress here.

I've removed the reference to LCD screens on the main page. Groogle 00:05, 10 November 2007 (UTC)[reply]
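Groogle's projected figures check out under the stated assumption; a quick verification sketch, treating Moore's law as doubling the total pixel count every 2 years, so each linear dimension grows by the square root of the pixel-count factor:

```python
import math

def projected_resolution(width, height, years, period=2):
    # Pixel count doubles every `period` years; each linear
    # dimension therefore scales by the square root of that factor.
    linear = math.sqrt(2 ** (years / period))
    return round(width * linear), round(height * linear)

# 10 years => 32x the pixels, ~5.66x each dimension:
#   640x480  -> ~3620x2715 (the "3600x2700" above)
#   1024x768 -> ~5793x4344 (roughly the "5600x4200" above)
```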

It is the market that is holding it back - no one wants to buy hi-res monitors because they don't realize that if they had them they could read 30% faster, just like on printed paper (at 300 dpi). When the public figures this out you will see the standard at about 5600x4200, just like you predicted. 199.125.109.48 (talk) 18:08, 10 February 2008 (UTC)[reply]

Processor Transistor Count, Processor Cache, and Processor Performance

There is a misconception in the "Software: breaking the law" section of the article about the relationship between the growth of processor transistor count and processing power. This section, along with other sections of the article, assumes that processing power increases in proportion to processor transistor count, specifically at the rate of Moore's Law, which is false.

The majority of transistors on a newer processor are allocated to the (on-die) processor cache; in fact, cache has been known to account for ~45% of a processor's transistor count. The Pentium M Banias has ~77 million transistors, while Dothan (a die-shrunk Banias with doubled L2 cache and minor improvements) has ~140 million, yet the performance difference between the two processors is only around 10-20% (against a near-doubling of transistor count). This illustrates the disparity between processing power and transistor count. There are also cases (see benchmarks between the Intel Pentium 4 Prescott-2M and Prescott) where an increased amount of cache (a higher transistor count) has provided negligible or even decreased performance (the issue is related to cache size and latency).

Although there is a general trend that an increased transistor count means increased performance, the rates at which these two factors change are different. Specifically, processor performance has several other determinants one needs to consider, such as an integrated memory controller, the microarchitecture (processor efficiency), and clock speed (the price difference between processors is usually not in the chips themselves but in the clock speed at which each is set to run, which affects performance; processor stability at high clock speeds is also an issue), all of which can alter a processor's performance independently of transistor count.

This misconception about processing power and its association with Moore's Law is fairly widespread (reason below) and should be noted in the article. Part of the reason it is so widespread is that when people see the Moore's Law graph, they associate it with what they understand (the brand names of the processors, e.g. Pentium 4) rather than with what matters (transistor count). Since the graph shows Intel's processors in (mostly) correct chronological order (e.g. the Pentium 4 is plotted after the Pentium 3, which is plotted after the Pentium 2, each having more transistors than its predecessor), and people know for a fact that a Pentium 4 is more powerful than a Pentium 3, which is more powerful than a Pentium 2, the source of the misconception is obvious. Also, one has to have a certain degree of knowledge about processors to understand the difference between transistor count and performance, which makes the issue all the more notable. --68.147.51.204 (talk) 08:58, 12 December 2007 (UTC)[reply]

I've added a mention about this in the other considerations section of the article. --136.159.209.101 (talk) 18:53, 12 December 2007 (UTC)[reply]

I believe the article only claims that all the various measures are increasing exponentially, not that they are increasing at the same exponential rate (or a proportional rate). The article also takes pains to describe the trend as "approximate" and "rough" (in the introduction, anyway).
In some of the other sections, if there are misstatements, you should correct them. I think this is an easy fix. (For example, "Self-fulfilling prophecy" states that the industry aims for "a specified increase in processing power". Should this say "a specified increase in transistor density"? Or are both statements true?) ---- CharlesGillingham (talk) 19:34, 14 December 2007 (UTC)[reply]

performance and power

Can we please replace the word power with performance where the latter is meant? I see a potential confusion with power consumption, which is also used in the article. Andries (talk) 10:09, 4 February 2008 (UTC)[reply]

Cost per unit area

While cost per transistor is being reduced by shrinking transistor size, the cost per unit area is not necessarily shrinking. That is because each new generation of product adds more and more features and layers into the same square centimeter. By considering how much expensive material is deposited and removed per square centimeter in an Intel 45 nm processor vs. an Intel quarter-micron processor, the point should be clear. The graph added for Economic Impact of Moore's Law should describe this pictorially. Guiding light (talk) 10:19, 24 March 2008 (UTC)[reply]

This issue of increasing expense is touched on in the section about industry (A self fulfilling prophecy), and could be covered in more detail there. There could also be a new section that details various aspects of digital technology that are not improving exponentially. If such a section were started, it should subsume the section on software. ---- CharlesGillingham (talk) 15:29, 24 March 2008 (UTC)[reply]

There may be enough published information on the memristor now to add a note about it to the "Future trends" section. (Although our article Memristor IMHO needs some significant improvement.) -- 201.37.229.117 (talk) 17:28, 1 May 2008 (UTC)[reply]

Switching energy

It would be nice to mention/link how the switching energy for a single transistor has decreased over time. Couldn't find the appropriate page on Wikipedia. Does it exist? --Michael C. Price talk 20:24, 8 May 2008 (UTC)[reply]

Requested move

Moore's Law → Moore's law — Unlike all the others in List of scientific laws named after people, this one has nonstandard capitalization of "Law"; let's fix it. —Dicklyon (talk) 06:12, 28 May 2008 (UTC)[reply]

Survey

Feel free to state your position on the renaming proposal by beginning a new line in this section with *'''Support''' or *'''Oppose''', then sign your comment with ~~~~. Since polling is not a substitute for discussion, please explain your reasons, taking into account Wikipedia's naming conventions.
  • Support – of course, as I proposed it. There's no reason this law in particular should be an exception to the usual Wikipedia naming convention, which all the other eponymous laws follow. Dicklyon (talk) 04:19, 29 May 2008 (UTC)[reply]
  • Neutral - personally I think that the capitalized version is more appropriate, so if this is left capitalized, we should rename the other "laws" to be capitalized. 70.55.85.131 (talk) 04:14, 30 May 2008 (UTC)[reply]
The convention at WP:NAME is "Do not capitalize second and subsequent words unless the title is almost always capitalized in English (for example, proper names)." Moore's law is found in books not capitalized nearly half the time, so that's way off from "almost always" capitalized. I think this articulated convention should trump the common notion of what's "appropriate", in the wikipedia context. Dicklyon (talk) 04:47, 30 May 2008 (UTC)[reply]

Discussion

Any additional comments: