
Talk:Microprocessor

Please add {{WikiProject banner shell}} to this page and add the quality rating to that template instead of this project banner. See WP:PIQA for details.
WikiProject Computing (C-class, Top-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as C-class on Wikipedia's content assessment scale.
This article has been rated as Top-importance on the project's importance scale.

The intro to this page makes no sense to someone without a background. —Preceding unsigned comment added by 71.163.67.111 (talk) 04:54, 6 June 2009 (UTC)[reply]

Concern about "GPU"

I'm a little wary that the article classifies GPUs as microprocessors. I have always seen the term "microprocessor" applied to an IC-based CPU. As I'm sure most readers realize, GPUs are much more akin to DSPs or stream processors than CPUs, despite the unfortunate acronym similarity. The programmability and general design model of GPUs certainly do not qualify them to be called CPUs. So my question is, is it appropriate to call a GPU a microprocessor, given that I've always known the term microprocessor to be related to CPUs? I'm not entirely sure; thoughts? -- uberpenguin 12:59, 20 October 2005 (UTC)[reply]

Okay, since nobody has ventured to add input to this concern, I'll just remove the offending text. -- uberpenguin 01:45, 18 December 2005 (UTC)[reply]
Hi, sorry - I didn't see this topic before I made my edit. Your concern seems to be that "microprocessor" should always mean "CPU", but I don't see any reason for that to be the case - they are not synonyms. Do you have any citations for a definition of "microprocessor" that would exclude GPUs? The fact that some GPUs are now used for non-graphics computation (see GPGPU) makes a pretty good case for their inclusion. MFNickster 02:16, 18 December 2005 (UTC)[reply]
There is no formal definition for the term 'microprocessor;' I'd be highly impressed if you could even find the first instance of its usage. I've personally never seen it used to include anything OTHER than CPUs. GPUs are called what they are, DSPs are called DSPs, etc. Just because GPUs perform arithmetic and are being used to a small extent as general purpose DSPs does not in itself qualify them as microprocessors in my mind (you wouldn't call a DSP a microprocessor, would you?).
I don't have to call a DSP a microprocessor; all I have to do is show examples of people in research and industry calling it that, which I have done. MFNickster 06:29, 20 December 2005 (UTC)[reply]
I guess I could turn your question back on you -- do you have any reference that suggests that a microprocessor is anything other than a CPU? It would be enough for me if you could find one or two major hardware vendors that classifies something as a microprocessor that isn't (or doesn't contain) a CPU. -- uberpenguin 02:24, 18 December 2005 (UTC)[reply]
Incidentally, FOLDOC only mentions CPUs in its definition of 'microprocessor.' 1 Its definition expands a bit from Wikipedia's, presumably to easily include microcontrollers and SoCs. -- uberpenguin 02:29, 18 December 2005 (UTC)[reply]
Fair enough! I would start with a dictionary definition from Merriam-Webster: "a computer processor contained on an integrated-circuit chip", which would include GPUs unless you define "computer" as a CPU only (circular logic) - but then, graphics computations are still computations.
Then I'd cite a few articles and pages:
[1] "ANTIC (locations 54272-54783) This chip is actually a specialized microprocessor in its own right. It controls the screen display through instructions to C/GTIA."
[2] "...if this still doesn't get them the required performance, go to a specialized microprocessor like a digital signal processor or even a custom microprocessor implemented in an application-specific integrated circuit — an ASIC."
[3] "A graphics processing unit (GPU) is a microprocessor that has been designed specifically for the processing of 3D graphics."
[4] "A DSP is a specialized microprocessor adept at high-speed arithmetic and real-time data transfer to and from the real world."
[5] "A DSP is a microprocessor designed to work with analog signals such as video or audio that have been digitally encoded."
[6] "...the microprocessor became most visible as the central processor of the personal computer. Microprocessors also play supporting roles within larger computers as smart controllers for graphics displays, storage devices, and high-speed printers."
On the opposing side:
[7] "The microprocessor is the central processing unit (CPU) fabricated on one or more chips"
[8] "microprocessor: a computer whose entire CPU is contained on one (or a small number of) integrated circuits"
[9] "A microprocessor generally means a CPU on a single silicon chip, but exceptions have been made (and are documented) when the CPU includes particularly interesting design idea..."
What do you think? I think there's evidence that they're not synonymous. If you imagine a Venn diagram with a "CPU" circle and a "microprocessor" circle, your definition would be the shaded overlapping area. A "microprocessor" seems to be (at the least) a single-chip computer, but it doesn't have to be the CPU even if it has the capability. For instance, look at the Sega Saturn: it used a Motorola 68000 for the sound controller. MFNickster 03:00, 18 December 2005 (UTC)[reply]
First, M-W is hardly a definitive source for computer related information. In any case, we are talking specifically about the term 'microprocessor,' not 'processor.' Since, as I stated earlier, there is no formal definition of the term, we must go by what is common usage in the industry. DSPs and stream processors are both 'processors' that are never called 'microprocessors' (unless in some SoC form).
'Processor' is simply an abbreviation of 'microprocessor', at least in systems that use ICs. In older systems, it's short for 'processing unit'. MFNickster 07:03, 18 December 2005 (UTC)[reply]
Now, point by point: 1. Quick searching indicates that Atari never referred to ANTIC as a GPU; indeed, most sources refer to it as a microprocessor on the merit that it could execute stored programs (something that no GPU can do by itself).
ANTIC is an example of a microprocessor which is not used as the CPU of a system. I think it supports my statement below that "GPUs contain CPUs", because it's a custom controller that meets the criteria for a microprocessor - it was never intended for use as a CPU. As far as I know, though, you are right that they never called it a GPU (that term came much later). But it is a graphics coprocessor, essentially the same thing. MFNickster 19:09, 21 December 2005 (UTC)[reply]
2. Why are you confusing ASICs with GPUs? What does that have to do with the question at hand? Do you know what this term actually refers to or are you just giving me googled links? 3. I've never heard of WAVE Report, and from what I can tell they aren't a manufacturer nor considered an authority in the field of digital microelectronics. 4. GPS World... That's a source for EE information? You'd trust one line of research done by someone not knowledgeable in the field? 5. *sigh* See my complaints with #3 and #4. Additionally, I fail to see any point you're making regarding the Sega Saturn. The M68k is undoubtedly a microprocessor and a CPU; it was simply used to control sound functions in that capacity. Perhaps some of this confusion lies in failing to separate design from functional capacity.
I'm starting to see where the confusion lies - and perhaps this is the solution to the quandary. A microprocessor is a physical unit, a chip, a component. A CPU is an abstraction, defined by function instead of form. The 68k (actually a CMOS 68EC000 in the Saturn, I see) is a specific implementation of a CPU, though it can serve other functions. A PowerPC or Pentium, while being a microprocessor, is much more - it contains functions traditionally separate from the CPU, such as floating-point units, MMUs, vector units. We could rewrite the intro to include the distinction between the physical chip (microprocessor) and the function it serves (CPU, FPU, DSP, GPU etc.). MFNickster 03:47, 18 December 2005 (UTC)[reply]
Yes you're right, but there's no confusion here. My issue is that you never will see someone refer to an FPU as a microprocessor
"A coprocessor is a second microprocessor that has been specially designed to perform a limited number of functions very quickly" [10] Need more? MFNickster 06:47, 20 December 2005 (UTC)[reply]
you'd rarely see a DSP called a microprocessor outside SoC applications, and to the best of my knowledge you'd never see a GPU referred to as a microprocessor by those in the industry. -- uberpenguin 04:02, 18 December 2005 (UTC)[reply]
Nvidia, The GeForce 6 Series GPU Architecture: "Figure 30-1. The GeForce 6800 Microprocessor" [11] Need more? MFNickster 06:47, 20 December 2005 (UTC)[reply]
Then perhaps a distinction can be made between general-purpose and special-purpose microprocessors? (Incidentally, if there is no formal definition, why are you asking me to cite one? just curious :) MFNickster 04:26, 18 December 2005 (UTC)[reply]
Please don't give googled links to support a point based on predisposition. I'm looking for a respectable reference work; some research paper published by digital VLSI designers, or perhaps a whitepaper by a manufacturer. You can't just use Google to support a point of view (for example, Google would probably provide sufficient evidence for supporting the notion that CPUs are all microelectronic and only exist in the x86 form). Don't get me wrong now, I'm not trying to be caustic or jump all over you, but providing a list of links from unusable reference sources doesn't assist the discussion. -- uberpenguin 03:20, 18 December 2005 (UTC)[reply]
Yes, these are Googled links which are not intended as reference, but simply evidence that other people often use the term "microprocessor" to mean something other than a CPU, specifically because you said "I have always seen the term 'microprocessor' applied to an IC-based CPU." You are using your own experience as (dare I say) original research, and since you said that "there is no formal definition" these viewpoints are just as valid as yours. If you want to search for reputable references to back your own definition, you're certainly welcome to do so. That said, would you agree that in a general article it's better to be more inclusive unless there's reason to do otherwise? MFNickster 03:47, 18 December 2005 (UTC)[reply]
I phrased it like that because one can never be 100% sure that their position is correct, and I'd never want to come across that way. I find it unlikely that I could find any formal paper that explicitly spells out what a microprocessor is, because they assume the reader already has an idea of their intentions in usage. The point is that when manufacturers and researchers refer to microprocessors, I have never seen a case where they did not mean CPU. If you'd like me to dig up long winded papers that support this in a general fashion I can, but that hardly proves my point. It's easier for me to simply ask you to find a citable source that uses the term to mean something other than a CPU. -- uberpenguin 04:02, 18 December 2005 (UTC)[reply]
I'll look for one, but since "there is no formal definition" that gives us a lot of leeway in the article's scope. MFNickster 04:26, 18 December 2005 (UTC)[reply]
Here's something else to ponder (using google :). If you search for the terms GPU and microprocessor on major GPU designers' sites (e.g. ATI, NVidia, Matrox), initial inspection indicates that they themselves never refer to their products as microprocessors. In the case of difficult to define product terms, I think the tendencies of vendors are the best standard to go by. -- uberpenguin 03:26, 18 December 2005 (UTC)[reply]
Actually, I did find a "FORM 40-F" for ATI which explicitly states "A GPU is a microprocessor specifically designed for processing 3D graphics data." I didn't include it because it's a PDF, but here is a TinyURL to the cached HTML version: http://tinyurl.com/aax27 MFNickster 03:47, 18 December 2005 (UTC)[reply]
Okay, now there's something that we can actually discuss. I think that might be a starting point, but I'm hesitant to consider it justification here because it's a legal document. While I'm not so hard-headed as to reject it on those grounds alone, I'd feel a lot better if we could find something written by researchers or designers (e.g. technical whitepapers) that uses this definition. I'll be looking for such a source myself... Right now I'm leaning towards adding text that points out that some people consider microelectronic GPUs, DSPs, etc to be 'microprocessors,' but in general the term is used to refer to CPUs and SoCs. -- uberpenguin 04:02, 18 December 2005 (UTC)[reply]
Well, they are microprocessors, if you take the word in a broad sense to mean "a chip that processes digital data", but as I pointed out, a CPU is really only part of a modern microprocessor, and such a chip doesn't have to be used as the CPU in a given system, so when you say "the term is used to refer to CPUs and SoCs", that is true but not the whole picture technically. Perhaps we can include a section for DSPs, GPUs, etc. describing how they were developed as specialized refinements of general-purpose microprocessors. The technology in silicon is basically the same, it's the function which differs. MFNickster 04:26, 18 December 2005 (UTC)[reply]
Well that's still the crux of our disagreement. I do not take microprocessor to be such a broad definition because it's very uncommon in my experience to see any researcher, engineer, or designer refer to anything that doesn't contain a CPU as a microprocessor... Indeed, so far it seems that the people who generally refer to GPUs as microprocessors are either laymen or otherwise far removed from computer engineering. Ehh... I'll do a quick survey of some of the newsgroups and technical forums I frequent. I'll also try to see if I can find the first usage of the term anywhere. -- uberpenguin 15:36, 18 December 2005 (UTC)[reply]
I understand, and I think the point is not to determine "who's right", but instead to enlighten the reader with accurate information. The usage of 'CPU' has changed somewhat - originally the microprocessor was a way to implement a CPU on one chip; now the CPU has become one part of a microprocessor chip. In a very real sense, DSPs and GPUs do contain CPUs - they are just dedicated to a specialized purpose. But when someone refers to the microprocessor of a system, they are always referring to the general-purpose CPU and not the microcontrollers, FPUs, GPUs etc. in the system, so you are right about that usage. The article should also contain some sense of the broader meaning (microprocessors as a class of ICs). MFNickster 15:52, 18 December 2005 (UTC)[reply]
"In a very real sense, DSPs and GPUs do contain CPUs - they are just dedicated to a specialized purpose." By what definition of CPU? Certainly not a common one... Most people these days define a CPU as a turing complete stored program machine. Most DSPs and GPUs fail one or both these requirements (do you have any notable counterexamples?).
I think I spoke too soon on that one, in light of my earlier comment about 'CPU' being defined by its function, not its implementation. What I really mean is that DSPs and GPUs have logic cores similar to general-purpose microprocessors, and process data in a specialized way - i.e. the single-chip CPU had to be developed before programmable DSPs and GPUs could be made. MFNickster 06:29, 20 December 2005 (UTC)[reply]
Again, I simply have to disagree that the term microprocessor is commonly used to refer to ICs that don't function as CPUs. All publications I've ever read by the IEEE and computer architecture researchers seem to agree. The more general term "microchip" could certainly mean a DSP or GPU, but I still see no reason to think of a microprocessor as anything BUT a CPU, and the engineers I've talked to agree. I looked through the library's archive of old IEEE publications, and discovered that the very first issue of IEEE Micro (February 1981), the IEEE's bimonthly for microprocessor and microcomputer development, contains an article by M. Hoff and R. Noyce entitled "A History of Microprocessor Development at Intel." In it, Noyce states that the term "microprocessor" emerged at Intel in 1972 (not too long after the 4004 was released) and of course was used to refer to CPUs implemented as small multi-IC packages. So certainly the term originally was intended to mean CPU, and as of yet I have seen no sources from the IEEE or component designers that suggest that the meaning has changed since then.
1981? 1972? You're going to have to do better than that. Please find an IEEE article or paper that definitively says that microprocessors are CPUs and only CPUs. Better yet, try to find something on a DSP or GPU chip and see how it is described. MFNickster 00:05, 19 December 2005 (UTC)[reply]
Pardon? You don't consider the first usage of the term to be relevant? That's the hardest evidence supporting either position in this entire conversation thus far.
Actually it isn't, since the meaning has changed over time to include specialized processor chips. The articles you cite were written at a time when putting a whole CPU (without extras like FPU, MMU, cache) on a single chip was considered quite a feat in itself. There were no single-chip DSPs at that time, and even the term FPU was less common than "math coprocessor," a name for a microchip which is a kind of processor. The term "microprocessor" is just a combination of those two terms anyway. MFNickster 04:41, 19 December 2005 (UTC)[reply]
Again, you will not find any reference that says in explicit terms that "a microprocessor is and only is a CPU" because papers that use the term just assume that the reader knows what the author is talking about. I have certainly read plenty of technical papers involving both DSPs and CPUs and, as I've said several times, have never seen them referred to as microprocessors
Yes, I know that - but absence of evidence is not evidence of absence. So far all you have offered is "I've never seen it and nobody I know uses it that way." Not good enough, because I have seen it used that way. The links I provided are just some quick examples of the common usage. Unless you can find a formal definition, then the article should cover all bases. You can easily find lots more examples, but it's up to you to dig deeper. You'll have to convince yourself, I can't do it for you. MFNickster 04:41, 19 December 2005 (UTC)[reply]
Just citing one paper wouldn't be sufficient evidence to invalidate your position, but if that's all you want, I can certainly provide a couple. -- uberpenguin 03:20, 19 December 2005 (UTC)[reply]
Please do - that's all I ask, is that you support your position. Also, please make a note whether these papers are referring to a microprocessor, or the microprocessor (CPU) of a single-processor system. I would find that distinction interesting and relevant. MFNickster 04:56, 19 December 2005 (UTC)[reply]
Additionally, I don't think we have the right to take liberties with a fairly well established term just because it seems like its usage could be expanded to other devices. If no industry publication or manufacturer seems to commonly use 'microprocessor' to refer to non-CPU devices, then I don't see why this article should. -- uberpenguin 22:18, 18 December 2005 (UTC)[reply]
How "well-established" the term is is what we're debating, so you're begging the question by calling your definition the "well-established" one. I'm not saying it seems like its usage could be expanded, I'm arguing that it has been expanded. If you want some examples from manufacturers, here are a few (yes, they're Googled - our libraries don't open until tomorrow, sorry) MFNickster 04:41, 19 December 2005 (UTC)[reply]
  • Texas Instruments [12] "A digital signal processor (DSP) is a type of microprocessor - one that is incredibly fast and powerful."
  • Intel [13] "Digital Signal Processor (DSP) - A specialized digital microprocessor that performs calculations on digitized signals that were originally analog, and then forwards the results."
  • Intel [14] "DSP: 1. Digital signal processor. A specialized microprocessor designed to perform speedy and complex operations with digital signals."
  • IBM [15] A Microprocessor for Signal Processing, the RSP: "The Real-Time Signal Processor (RSP) is a microprocessor architecture that was created to exploit these characteristics in order to provide an expeditious and economical way to implement signal processing applications."

Halfway through making a list of papers from IEEE journals to demonstrate the term's usage, I decided that all this rhetoric is really silly over a minor terminology disagreement. I went ahead and wrote a section describing the usage of "microprocessor" to mean something other than a CPU; feel free to add to it or revise it as you see fit. I still hold that DSPs and GPUs are not in themselves microprocessors, but I doubt many people would have such issues with using the term thus. I do feel strongly, however, that when no further clarification is given, the term "microprocessor" can safely be assumed to refer to a CPU. The section I wrote reflects that point. -- uberpenguin 22:40, 19 December 2005 (UTC)[reply]

Oh, and just so you don't think I've been blowing smoke about this whole point:
  1. Template:Citepaper publisher - Paper describing the architecture of the TM3270 media processor. It's somewhat similar to a DSP/GPU, but is actually much closer architecturally to a CPU than GPUs are. The article never refers to the TM3270 as a CPU or a microprocessor, but as a "media processor" (actually, I think a very apt term for GPUs and CPUs).
That is interesting. The TM3270 looks like a general-purpose CPU with custom media extensions. This press release [16] refers to it as a CPU, but not as a microprocessor. MFNickster 01:05, 21 December 2005 (UTC)[reply]
  2. Template:Citepaper publisher - Paper specifically addressing general purpose programming on the latest generation of programmable GPUs (this was only published in October of this year). It refers to GPUs as "stream processors," never microprocessors. It even makes a very clear distinction between GPUs and CPUs (as, IMO, it should).
  3. Template:Citepaper publisher - Another paper talking in some detail about general purpose computation on GPUs. Always uses the terms "graphics coprocessor" or simply "graphics processor," never microprocessor.
I just found these resources in a few minutes of digging through recent IEEE journals; and there are several more I haven't cited that talk extensively about GPU microarchitecture and always use terms like "graphics processor." True, omission is never sufficient to prove the point, but it does show (in a small way) that the trend by professionals in the field of digital microarchitecture is to refer to GPUs as what they are, and not as microprocessors. I do believe that this latter term is used much more commonly by laymen or those inexperienced in the field who don't already make the mental association of microprocessors with CPUs and simply decide that the term COULD apply to other things. -- uberpenguin 23:02, 19 December 2005 (UTC)[reply]
Perhaps. I didn't think you were "blowing smoke," just that you were only seeing part of the picture. I hope I've made fair case that such usage is more common in the industry than you have seen before. If you dig a little deeper, you'll find plenty of examples. MFNickster 06:29, 20 December 2005 (UTC)[reply]
Umm... You know you're really just beating this thing to death now; I agreed several edits back that it would be okay to mention GPUs and CPUs in the article, and I added a section addressing these myself. -- uberpenguin 19:22, 21 December 2005 (UTC)[reply]
Okay, I'll give it a rest! :) I've just done a bit of research and would like to give future editors the benefit of that. MFNickster 19:39, 21 December 2005 (UTC)[reply]


I would like to add that prior to 1972 there were no formal definitions of microprocessor or microcomputer. The general term used was LSI (even for a processing unit). In 1972, Hank Smith, then Intel Marketing Manager, gave this definition in a speech for the IEEE 1973 WESCON Professional Conference. He said "A CPU uses P-channel MOS and is contained in 1, 2, 3 or 4 LSI standard dual-in-line packages from 16 - 42 pins per package". This was as close as he could come to a definition and it was based on the current technology used by Intel. Of course, later technology and definitions changed. I think it is very important to put some attention on the phrase "Single Chip Microprocessor", as the Intel 4004 and 8008 were called. It should be known that it took about 52 outside chips to make the 4004 work and about half that many to make the 8008 work. The F-14 MP944 chip set had no outside devices for the processor. In support of the technology in 1968 I offer this paper, "LSI Technology State of the Art in 1968". Ray Holt 01:25, 04 April 2007 (UTC)

References

Here are some reference points for inclusion of a "specialized microprocessor" subsection. MFNickster 05:58, 18 December 2005 (UTC)[reply]

Corroboration within Wikipedia, for consistency:

NPU "Network Processing Unit or NPU is a CPU whose instructions are specialized to handle networking-related functions."
Microcontroller "A microcontroller is a computer-on-a-chip used to control electronic devices. It is a type of microprocessor emphasizing self-sufficiency and cost-effectiveness, in contrast to a general-purpose microprocessor, the kind used in a PC."
Graphics processing unit (old version) "A Graphics Processing Unit or GPU (also occasionally called Visual Processing Unit or VPU) is the microprocessor of a graphics card (or graphics accelerator) for a personal computer or game console"
Digital signal processor "A digital signal processor (DSP) is a specialized microprocessor designed specifically for digital signal processing, generally in real-time."

Books

  • From The Winn L. Rosch Hardware Bible, Third Edition:
"At heart, a [numeric] coprocessor is a microprocessor but unlike a general purpose microprocessor it is dedicated to its specific function as a special purpose device." (p. 151)
"Graphic coprocessors are full-fledged microprocessors that are designed primarily for carrying out graphic operations." (p. 622)
"A DSP need be nothing more than a microprocessor optimized for processing audio signals." (p. 789)
  • From "IA-32 Processor Architecture," Section 2.4.2 Video Output (p. 60) [17]
"The video controller is itself a special-purpose microprocessor, relieving the primary CPU of the job of controlling video hardware.
  • From Signal Processing Handbook, C.H. Chen, Ed., 1988:
"Advances in IC technology have made possible microprocessors of ever-increasing complexity whose architectures are tailored to DSP algorithms." (p. 193)
"This section will discuss the design of general-purpose digital signal processors. We will restrict our attention to microprocessors and use the term microprocessor and microcomputer interchangeably." (p. 197)
  • From Digital Signal Processing Implementations Using DSP Microprocessors, with Examples from TMS320C54xx, Avtar Singh & S. Srinivasan, 2004:
"A programmable digital signal processor is a microprocessor whose architecture is optimized to process sampled data at high rates." (p. 3)
  • From The Microprocessor: A Biography, Michael S. Malone, 1995:
"...we have basically restricted ourselves to the characteristics of microprocessors used in the central processing units of computers ... there are other kinds of microprocessors as well, most notably microcontrollers ... beyond the features they share with their central processing counterparts, also add another important function: digital signal processing." (p. 120)

Academic

  • "Real-Time Computing For Human Computer Interfacing", Princeton University [18]
"A Digital Signal Processing chip (DSP) is a microprocessor designed specifically to implement Digital Signal Processing (DSP) algorithms."

Industry

  • Electronic Engineering Times [19]
Mapping computational concepts to GPUs, Mark Harris, Nvidia Corp.
"The computational speed on microprocessors is increasing faster than communication speed, especially on parallel processors such as GPUs."
  • NXP Semiconductor [20]
"The TriMedia processor, developed by Philips, is a special-purpose microprocessor for the real-time processing of audio, video, graphics and communications data streams."
  • Microsoft [21] "graphics coprocessor, n. - A specialized microprocessor, included in some video adapters, that can generate graphical images such as lines and filled areas in response to instructions from the CPU, freeing the CPU for other work."
  • Bluetooth Designer resource for engineers [22] "Digital Signal Processor: a microprocessor dedicated to real-time signal processing."
  • Apple IIgs Tech Note #11 [23] "The Ensoniq DOC in the Apple IIGS is actually a microprocessor dedicated to producing sound."

Redundancy between articles

There's a similar list on Central processing unit. Do these need merging, or is one the parent article of the other? -- Tarquin 16:57 Jan 5, 2003 (UTC)

Abbreviation: µP

Is a microprocessor actually abbreviated μP? It certainly isn't an abbreviation in common usage, so if this is some specific jargon it should be labeled as such. --Delirium 04:51, Dec 12, 2003 (UTC)

I think it's an old habit, from the early days when most of the people using μPs (see, it just slipped out :-) ), were EEs, and used to saying μF for capacitors and the like. I see it in my old copies of Byte for instance (one of them also mentioned a North Star μdisc system, heh-heh). uP was a later concession to the limitations of ASCII. This all is worth noting, but as an older and informal usage, doesn't really need to be at the top. Stan 05:12, 12 Dec 2003 (UTC)
Also, µP and µC (microcontroller) are often used when quickly drawing embedded system concept sketches on a black/whiteboard or for that matter, on the proverbial napkin, so I felt that the abbreviation(s) should be very visibly included in the relevant articles (and made into associated #redirects). The general case, as Stan touches upon, is that µ and other Greek letters are much used in science/engineering environments to save space/time in written material. --Wernher 23:27, 12 Dec 2003 (UTC)

Leonardo's computer

Regarding the claim: There have even been designs for simple computing machines based on mechanical parts such as gears, shafts, levers, Tinkertoys, etc. Leonardo DaVinci made one such design, although none were possible to construct using the manufacturing techniques of the time. ... Does anyone know if the Leonardo DaVinci mechanical 'computer' or 'processor' claim is true? It's not mentioned in the Leonardo article, unless Leonardo's robot is considered a computing device. Reading up on the 'robot' does not sell me on the 'computing' possibility, though it is obviously an impressive contraption for the time. -- Ds13 03:31, 2004 Apr 15 (UTC)

Yes, mechanical computers have been designed and built. I suspect the original writer is thinking about the difference engine and analytical engine designed by Charles Babbage. --68.0.124.33 (talk) 02:16, 8 March 2008 (UTC)[reply]

No Intel?

Even though they dominate desktop computers, there is almost no mention of the x86 family of processors in the history section after the i386?

MIPS is not only used in embedded systems "like Cisco routers". The PlayStation game consoles are perhaps better known?

Remember Intel is mentioned quite a lot in the beginning. After that AMD is mostly mentioned because it gained a lead over Intel.
To this day AMD is still dominating Intel, and if you're doing a paragraph on modern microprocessors you should do it on the best,
AMD.
Yeah, sure. -- mattb @ 2007-04-07T03:00Z

I find it odd that the notable 32-bit section says the following: "The most famous of the 32-bit designs is the MC68000, introduced in 1979." The question here is, if the word famous is being used in the normal fashion, shouldn't the MOST famous 32-bit design be a member of the x86 family? Regardless of how many applications there were of the 68k series, fame is a measure of popular knowledge. I'm not saying that the x86 family needs a boost in the article so much as that a word other than famous should be used to describe why the 68k series is more SIGNIFICANT than the x86, which I would argue it is. Jo7hs2 22:15, 2 November 2007 (UTC)[reply]

Contradiction: single versus multiple chips

The initial definition says a microprocessor is implemented on a single chip, which I have always understood to be an essential feature. However, further down the page there is mention of multi-chip 16-bit "microprocessors", which by this definition cannot exist. --Anonymous

You might have a valid point there, and I've always thought so myself. On the other side, cf. the definition at FOLDOC:
microprocessor <architecture> (Or "micro") A computer whose entire CPU is contained on one (or a small number of) integrated circuits.
Thus, many two or three-chip CPUs qualify as a µP, such as the RCA CDP1801 and Intel iAPX 432 (which, contrary to my general assumption, I have always thought to be proper µPs). I think the essential part of the definition is the clause "or a small number of", which precludes CPUs made out of piles of TTL chips, but includes CPUs consisting of, say, 1--4 LSI chips. --Wernher 03:28, 9 Feb 2005 (UTC)
I think a lot of discussion about computing terminology can be resolved by looking into the origins of the words. (Luckily this is easy with computer jargon, which hasn't been around as long as the rest of the language.) I don't have evidence to back this up, but I suspect that the term Microprocessor was originally marketing speak for the processors in the computers that came after minicomputers, and therefore there isn't actually a rigorous technical definition.

Move History of Operating System support for 64 bit microchips

Is there any support for moving the section History of Operating System support for 64 bit microchips somewhere else, like maybe Operating Systems? It doesn't seem to serve much purpose here (other than a thinly veiled "Linux good, M$ bad" dig).

Regarding the merging of the two articles

re: Suggestion to merge CPU and Microprocessor

I've done some preliminary work on this user subpage of mine. Work is still ongoing, though, and it's far from complete.

Anyone care to load it up, tell me what you think, maybe make a few changes to it? Leave any comments on my user talk. Thanks.

Also, we'll need to decide, if we do go ahead with the merger, which page to keep and which to change to a #redirect.

splintax 14:47, 8 September 2005 (UTC)[reply]

I personally oppose merging CPU with microprocessor. CPUs existed before ICs. Ancheta Wis 16:36, 11 September 2005 (UTC)[reply]
I agree with Ancheta Wis. Not only did CPUs exist before ICs, but to this day there are still some CPUs that are not microprocessors, though they are usually only found in rather specialized applications. Furthermore, it is now common for a microprocessor to contain more than just a CPU, e.g., memory controllers, simple peripherals. --Brouhaha 01:08, 12 September 2005 (UTC)[reply]
Okay then, I've abandoned the project. I didn't think it was all that great an idea myself but there seemed to be a fair bit of support for it around here. Perhaps the warning should be removed?
In the spirit of "less talk, more work" I have taken upon myself the ambitious task of making the CPU article not suck. I hope that when I'm done you will see no need to merge the two articles. I believe that you will also be able to condense this article a bit. I'm not finished yet, but I've made some definite progress. I still need to write the largest section regarding CPU implementation (both historically and modern considerations, which will be brief since we have a pretty decent CPU design article). I encourage you all to look over my edits and offer any advice/help that you can. I'm also currently hunting down some good images to use in the article (I've located some, but I need the authors' consent to upload them). -- uberpenguin 23:31, 25 September 2005 (UTC)[reply]

Some parts of this article are also similar to integrated circuit.

NPOV

"As with many advances in technology, the microprocessor was an idea whose time had come." This isn't the right style of writing, and it isn't neutral. I haven't read much of the article, so there may be more. Also, a grammar error, right after the explanation of Moore's Law. I would normally do it, but... not now. Too tired, I'd mess it up.Twilight Realm 01:22, 27 September 2005 (UTC)[reply]

Difference from IC

This article, along with the articles for integrated circuit, CPU, etc., doesn't make it clear what's different about them. How exactly is a microprocessor different from an IC other than "microprocessors are very advanced integrated circuits"? Twilight Realm 01:16, 30 September 2005 (UTC)[reply]

Um.. Well you just said the difference... One is a specific and notable type of the other. Microprocessors ARE, in general, advanced and complex integrated circuits whose purpose is that of a CPU. -- uberpenguin 00:50, 29 September 2005 (UTC)[reply]

Are there any criteria defining microchips? Or is it just a subjective term? The IC article says "For the first time it became possible to fabricate a CPU or even an entire microprocessor on a single integrated circuit." To me at least, that sounds like there's a specific level an IC must pass to be considered a microprocessor, or that there's a difference, that a microprocessor contains some components that an IC doesn't necessarily have to have. Clarification would be appreciated, even though it's not in this article. Twilight Realm 01:16, 30 September 2005 (UTC)[reply]

A microchip is really just an informal term for an integrated circuit, they are the same thing. "aka microchip", as it says in the article. I agree about the sentence quoted not being correct, I have changed it slightly. Alf Boggis (talk) 09:29, 30 September 2005 (UTC)[reply]
Be careful not to confuse microchips and microprocessors. Microprocessor almost ALWAYS means a CPU (Von Neumann-like) that is fabricated on one or more ICs. "Microchip" is usually just another term for IC. Something like a DSP/DAC could be fabricated on a microchip (or IC), but it would not likely be considered a microprocessor. -- uberpenguin 22:20, 1 October 2005 (UTC)[reply]

Interesting

How about this The article. 134.250.72.176

Apologies for no summary

Sorry that I didn't use an edit summary on my last edit, but I clicked on save page instead of minor edit. It was just reverting vandalism though. --Apyule 12:36, 2 November 2005 (UTC)[reply]

Removed "History of 64-bit support" section

This section is totally misplaced here. Not only does it have NOTHING to do with microprocessors, but it was TOTALLY wrong. Linux support for 64 bit microprocessors dates back to the Alpha and MIPS ports (LONG before x86-64). Windows support also dates back to NT 3's Alpha and MIPS R4xxx ports. Likewise, Mac OSX's blood relatives Darwin, Mach, and L4 all ran on 64-bit microprocessors before OSX was compiled for PowerPC64.

The section was REALLY 'history of OS support for x86-64,' which is already included in the AMD64 article (in much more complete form). -- uberpenguin 02:38, 18 December 2005 (UTC)[reply]

"In 64-bit computing, the DEC(-Intel) ALPHA, the AMD 64, and the HP-Intel Itanium are the most popular designs as of late 2004." Was that really true? And, if so, is it still true? Or is "popular" defined as something other than "most common"? I suspect there might be more 64-bit SPARC machines and 64-bit POWER/PowerPC machines (especially if you include AS/400 and iSeries PowerAS) than Alpha machines, much less Itanium machines. Guy Harris 19:06, 24 December 2005 (UTC)[reply]

Yeah, it's a spurious claim for sure. It would be far better to say that they are all popular designs (ALPHA isn't all-caps either...) -- uberpenguin 19:41, 24 December 2005 (UTC)[reply]



Hey, didn't you forget the Intel Pentium D (Dual Core) processors? I feel the performance of Intel is way better than DEC, AMD, and others. 01/27/2005 posted by - swoosh

OK, so make it
In 64-bit computing, DEC Alpha, AMD64/EM64T, MIPS, SPARC, PowerPC/IBM POWER, and HP-Intel Itanium are all popular designs.
to include both AMD and Intel variations of that instruction set architecture. Guy Harris 01:50, 26 January 2006 (UTC)[reply]

I think there should be more links to processor architecture from this page. Von Neumann, Harvard, DIB, etc. - swoosh

First commercial RISC architecture

I'm not sure if MIPS was really the first here. ARM was made in (working!) silicon (ARM1) on April 26, 1985; first products were sold in 1986 (exact date missing: the "ARM Development System", a second-processor card for the BBC Micro), and the first workstations were released in June 1987 (Acorn Archimedes). But I don't know when the first working MIPS silicon was made (I find 1985-1987 on the web; mips.com says nothing), what the first MIPS-based products were, or when they were released. Some of the early products I know of are the DECstation 2100 (1989), SGI Indigo (1990), and MIPS Magnum 3000 (1990). Another candidate would be the IBM ROMP; the first workstation was released in 1986 (exact date missing), and other products before that are unlikely. - Alureiter 16:02, 7 February 2006 (UTC)[reply]

Moore's Law

WRT the reverting to #transistors doubling every 18 months: I initially thought that this was wrong, but on checking the article, even though 18 months is oft quoted, 24 seems to fit the data much better. Also, from Moore's law:

"In 1975, Moore projected a doubling only every two years. He is adamant that he himself never said "every 18 months", but that is how it has been quoted. The SEMATECH roadmap follows a 24 month cycle."

I think the best thing may be to change the 18 at the top of the Moore's Law article to 24, and re-revert the change here. Comments? --Mike Van Emmerik 22:42, 27 February 2006 (UTC)[reply]

I don't think that's a problem, as long as it is consistent with the other article. The "law" itself is not very rigid, as the article on it makes clear - the time period and the meaning of "complexity" can vary depending on which trends you look at. It might be enough to simply make note in this article that the complexity of integrated circuits and number of transistors on microprocessors have increased over time (while cost has stayed relatively flat) and simply link to the "Moore's Law" article. MFNickster 02:05, 28 February 2006 (UTC)[reply]
Whatever the actual statistics are, the formulation of Moore's Law is that the number of transistors doubles every 18 months. So readers of this article are presented with a definition of the law that's directly contradicted by the article on Moore's Law. We should get this straight. --Mr random 20:20, 8 August 2006 (UTC)[reply]
You should have corrected the Moore's Law article instead - Moore said 2 years, not 18 months. The first line of that article is factually incorrect. MFNickster 22:34, 8 August 2006 (UTC)[reply]
I have corrected both the Moore's Law entry and this one. The fact that the intro paragraph of the Moore's Law article contradicted the direct quotation of Moore that immediately followed, and referenced the interview it came from, and went against information that was established on the Talk page, yet still managed to endure for over a month is discouraging. — Aluvus t/c 02:20, 9 August 2006 (UTC)[reply]
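For anyone wondering why the 18-versus-24-month distinction is worth fussing over, here is a rough back-of-the-envelope comparison (a minimal Python sketch, assuming the oft-cited ~2,300-transistor count for the 4004 as a starting point and an arbitrary 20-year span; both figures are just illustrative assumptions):

    # Compare 18- vs 24-month doubling periods over 20 years.
    # Starting count of 2,300 is the commonly quoted 4004 figure (assumption).
    start = 2300
    years = 20
    for months in (18, 24):
        doublings = years * 12 / months
        print(f"{months}-month doubling: ~{start * 2 ** doublings:,.0f} transistors")

The two periods end up roughly an order of magnitude apart after two decades, which is why quoting the right period matters for the article.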

Missing and bloated sections.

The first paragraph tells me what a microprocessor is made of but doesn't tell me what it does. I would like to see a succinct sentence about what a microprocessor actually does (execute instructions, for example), and then perhaps explain it a bit more in a section farther down in the article.

Then at the end of the article there are three screens full of lists of various stuff. On Wikipedia it's easy to allow lists to get out of control and lose sight of what makes a thorough, balanced article. And complete doesn't mean we have to make a list of every possible internal and external link that might be somehow related!

So, tell me what the thing does and judiciously select a very few closely related links that might also be helpful. JonHarder 22:10, 16 July 2006 (UTC)[reply]

Well this again brings up the issue I raised months back (see the top of this talk page). Is a microprocessor necessarily a CPU? Can microprocessors be non-programmable? If that's the case, does a microprocessor have to execute "instructions" in the CPU sense? It's a big mess, just like the state of this article. Big issues about simple terminology have to be resolved before this article can see the sweeping changes it needs. -- uberpenguin @ 2006-07-16 23:01Z
I don't think it's too bad. The intro describes the function of the part (a microprocessor is a part), and links to the CPU article. If the readers read the CPU article as well, they'll have a better idea of what a microprocessor does. However, it does feel like there's a paragraph missing there - something that explains how microprocessors "made possible the advent of the microcomputer," a role previously filled by several parts. We should describe how they were able to do that. MFNickster 05:50, 17 July 2006 (UTC)[reply]

First or second?

From text:
The world's first single-chip 32-bit microprocessor was the AT&T Bell Labs BELLMAC-32A, with first samples in 1980, and general production in 1982(...)
but a few lines later:
The most famous of the 32-bit designs is the MC68000, introduced in 1979.
So, which one is right? Was it the BELLMAC-32A or the MC68k?


Alejandro Matos 14:47, 20 November 2006 (UTC)[reply]

Depends whether you're measuring word size or address bus lines. The 68000 had a 32-bit word size but a 16-bit address bus, and it wasn't until 1984 that the 68020 was introduced with a 32-bit address bus. MFNickster 17:09, 20 November 2006 (UTC)[reply]
That turns out not to be the case. Look at the Motorola databooks for the 68000: it had 24 address lines and 16 data lines. --Wtshymanski 18:50, 20 November 2006 (UTC)[reply]
Quite right. 24-bit address space, but the address registers were 32 bits wide. I was thinking 16-bit data bus, but my fingers typed "address bus." I think the way the article elaborates on this after the "MC68000 introduced in 1979" is enough to explain why the 68K was a 32-bit processor, though in some ways not a "full" 32-bit processor. MFNickster 02:24, 21 November 2006 (UTC)[reply]
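To put numbers on the address-line distinction (a quick illustrative calculation, not taken from any source cited above):

    # Addressable memory for 24 vs 32 address lines (68000-style vs 68020-style).
    for bits in (24, 32):
        print(f"{bits} address lines -> {2 ** bits // 2 ** 20} MiB addressable")

That is 16 MiB for the 68000's 24 external lines versus 4096 MiB (4 GiB) once the 68020 brought out all 32 bits, even though the address registers were 32 bits wide in both.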

I suggest a link to my site called 'How Computers Work: Processor and Main Memory' at http://www.fastchip.net/howcomputerswork/p1.html . It tells how a processor and memory work simply and in COMPLETE DETAIL. A microprocessor is a processor on a single chip. It is not to replace the 'How Stuff Works' link but complement it. If you understand this book/site, you will understand PRECISELY what a microprocessor and its main parts are and how they work together. Thinkorrr 01:09, 4 December 2006 (UTC)[reply]

μP/μC Patent Disagreement

I just googled "Hyatt microprocessor" and found this. Apparently TI overturned the earlier patent on the grounds that it was never implemented at the time. --ArtifexCrastinus 06:57, 12 December 2006 (UTC)[reply]

Exhaustive Discussion (with references) of history of invention: Schaller PhD thesis chapter 7

...see http://home.comcast.net/~gordonepeterson2/schaller_dissertation_2004.pdf

 The paper can be found on the Computer History Museum web site: http://corphist.computerhistory.org/corphist/documents/doc-487ecec0af0da.pdf

The main article is missing, among other things, the Four-Phase AL1 (one of several claims prior to the Intel 4004). Schaller's discussion is even-handed and makes it clear that the history is complicated enough for it to be impossible to simply pick a "winner" as being "the first".

Schaller begins "CHAPTER 7: The Invention of the Microprocessor, Revisited" with an excellent selection of quotes from other cited sources:

"The 4004, invented by Intel, was the world's first commercially available microprocessor." (Intel website)1

"TI invents the single-chip microcomputer and receives the first patent for the single-chipmicroprocessor, ushering in the personal computer era." (Texas Instruments website)2

"The first microprocessor in a commercial product was Lee Boysel's AL1, which was designed and built at Four-Phase for use in a terminal application in 1969." (Nick Tredennick)3

"Alongside to the IC, the invention of the 'micro-processor' (MPU - Micro Processing Unit) is the greatest invention of the 20th century in the field of electronics." (Busicom Corp.)4

"[T]he idea of putting the computer on a chip was a fairly obvious thing to do. People had been talking about it in the literature for some time, it's just... I don't think at that point anybody realized that the technology had advanced to the point where if you made a simple enough processor, it was now feasible.~] (Ted Hoff)5

"Having been involved with integrated electronics when I was at Intel, we never conceived of patenting a computer on a chip or CPU on a chip, because the idea was patently obvious. That is you worked on a processor with 25 chips, then 8 chips, and by- God eventually you get one chip so where's 'the invention'." (Stan Mazor)6

"Such inventions don't come from new scientific principles but from the synthesis of existing principles... Because these inventions have a certain inevitability about them, the real contribution lies in making them work." (Federico Faggin)7

"[A]t the time in the early 1970s, late 1960s, the industry was ripe for the invention of the microprocessor. With the industry being ready for it, I think the microprocessor would have been born in 1971 or 1972, just because the technology and the processing capability were there." (Hal Feeney)8

"I don't think anyone 'invented' the microprocessor. Having lived through it, this [claim] sounds so silly." (Victor Poor)9

"It is problematic to call the microprocessor an 'invention' when every invention rides on the shoulders of past inventions." (Ted Hoff)10

"Most of us who have studied the question of the origin of the microprocessor have concluded that it was simply an idea whose time had come. Throughout the 1960's there was an increasing count of the number of transistors that could be fabricated on one substrate, and were several programs in existence, both commercial and government funded, to fabricate increasingly complex systems in a monolithic fashion. (Robert McClure)11

"The question of 'who invented the microprocessor?' is, in fact, a meaningless one in any non-legal sense. The microprocessor is not really an invention at all; it is an evolutionary development, combining functions previously implemented on separate devices into one chip. Furthermore, no one individual was responsible for coming up with this idea or making it practical. There were multiple, concurrent efforts at several companies, and each was a team effort that relied on the contributions of several people." (Microprocessor Report)12

"The emergence of microprocessors is not due to foresight, astute design or advanced planning. It has been accidental." (Rodnay Zaks)13

"The only thing that was significant about the microprocessor was that it was cheap! People now miss this point entirely." (Stan Mazor)14

Notes:
1. "Intel Consumer Desktop PC Microprocessor History Timeline," http://www.intel.com/pressroom/archive/backgrnd/30thann_timeline.pdf
2. "History of Innovation: 1970s," http://www.ti.com/corp/docs/company/history/1970s.shtml
3. Nick Tredennick, "Technology and Business: Forces Driving Microprocessor Evolution," Proceedings of the IEEE, Vol. 83, No. 12, December 1995, 1647.
4. "Innovation: The World's first MPU 4004," http://www.dotpoint.com/xnumber/agreement0.htm
5. Ted Hoff as quoted in Rob Walker, "Silicon Genesis: Oral Histories of Semiconductor Industry Pioneers, Interview with Marcian (Ted) Hoff, Los Altos Hills, California," Stanford University, March 3, 1995.
6. Stan Mazor, Stanford University Online Lecture, May 15, 2002, 020515-ee380-100, http://www.stanford.edu/class/ee380/
7. Federico Faggin, "The Birth Of The Microprocessor: An invention of major social and technological impact reaches its twentieth birthday," Byte, Volume 2, 1992, 145, http://www.uib.es/c-calculo/scimgs/fc/tc1/html/MicroProcBirth.html
8. "Microprocessor pioneers reminisce: looking back on the world of 16-pin, 2000-transistor microprocessors," Microprocessor Report, Vol. 5, No. 24, December 26, 1991, 13(6). Hal Feeney helped design the 8008 at Intel.
9. Vic Poor, former vice president of R&D for Datapoint, telephone interview with the author, June 5, 2003.
10. Dean Takahashi, "Yet Another 'Father' of the Microprocessor Wants Recognition From the Chip Industry," Wall Street Journal, September 22, 1998, http://www.microcomputerhistory.com/f14wsj1.htm
11. E-mail/newsgroup posting to Dave Farber's IP list (dave@farber.net) dated May 12, 2002. McClure was formerly with TI and helped found CTC; he also was an expert witness in the Boone patent case.
12. Microprocessor Report, op. cit.
13. Rodnay Zaks, Microprocessors: from chips to systems, 3/e, SYBEX Inc., 1980 (first edition published 1977), 29.
14. Stan Mazor, telephone interview with the author, June 10, 2003.

It's a rich source of information for enhancing the main article (and quite interesting reading for its own sake).

Dougmerritt 04:32, 23 January 2007 (UTC)[reply]

First 16-bit single-chip processor contradiction: "National introduced the first 16-bit single-chip microprocessor, the National Semiconductor PACE..." and then a paragraph or so later, "The first single-chip 16-bit microprocessor was TI's TMS 9900..."

Dead Link: History of general purpose CPU

There is no such article. If someone removed it, please provide a substitute. If not, please remove the link. Landroo 13:31, 1 September 2007 (UTC)[reply]

Fixed. MFNickster 00:55, 3 November 2007 (UTC)[reply]

Wayne D. Pickette

Look at these articles everyone!

http://www.indybay.org/newsitems/2004/12/08/17088681.php

http://www.thocp.net/biographies/pickette_wayne.html


It's about the real brains and the actual "father" of the microprocessor. How come he isn't included in this article? And there isn't a single mention of him in Wikipedia either! His name doesn't appear anywhere as far as I've seen! Seriously, this is one great guy screwed by Intel, Fairchild etc. big time!

And this is to the moderator(s): kindly don't hide what I've just written (with a * or whatever). Certain stuff needs to be spoken out loud!

Hope he gets the credit due to him soon!


Krishvanth (talk) 06:50, 5 January 2008 (UTC)[reply]

Funny

It's funny how this article explains jack shit about how microprocessors work. The simplest thing this article should have is somehow nonexistent. —Preceding unsigned comment added by 137.28.55.114 (talk) 21:55, 31 January 2008 (UTC)[reply]

See Central processing unit#CPU operation. --Wtshymanski (talk) 15:38, 18 September 2008 (UTC)[reply]

Military use in F-14 Tomcat

In 1968, Garrett AiResearch, with designer Ray Holt and Steve Geller, were invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter.

Was the processor used for the flight control computer, or for the Fire Control System (FCS)? Because as far as I know, the Tomcat didn't have a fly-by-wire control system. Maybe the author meant the FCS but got confused? —Preceding unsigned comment added by 79.107.73.166 (talk) 05:21, 28 October 2008 (UTC)[reply]

History of microprocessors

Two concerns: Firstly, only the history of 64-bit microprocessors for personal computers is covered. Secondly, coverage of multicore microprocessors is limited to "mass-market" designs. Is there any reason why the coverage should not be extended to all microprocessors? I am willing to make an effort. Rilak (talk) 04:23, 24 February 2009 (UTC)[reply]

It would appear to me that "64-bit designs for personal computers" is actually the shortest history section, although it is in some sense the culmination of some earlier threads. And then of course there is History of general purpose CPUs. Certainly there are parts of the article that could use work, but I'm not sure that I understand your particular concern. — Aluvus t/c 05:26, 24 February 2009 (UTC)[reply]
But History of general purpose CPUs does not cover 64-bit microprocessors. It does not seem to present the history organized by bit width. It doesn't cover multicore microprocessors either; it mentions them, but no dates or examples are given. This article could link to 64-bit and Multicore, respectively, where there are more complete histories, but then what is the point of selectively covering certain parts of history? Why have a section titled "64-bit designs in personal computers"? No other section has this restriction, and 64-bit microprocessors have been around in computers and consumer electronics such as game consoles for some time. Perhaps brief summaries of the important events and a link to each subject's article should replace the present sections? Rilak (talk) 06:07, 24 February 2009 (UTC)[reply]

added citations

Added citations to the first types section. Matsuiny2004 (talk) 21:58, 18 April 2009 (UTC)[reply]

Can somebody do some more research on the TMS 1000? The source I have used considers it a microcontroller. If this is so, then should it not be moved to the microcontrollers article? Matsuiny2004 (talk) 22:37, 18 April 2009 (UTC)[reply]

added section

Added a small section on the history of general purpose microprocessors. Matsuiny2004 (talk) 22:11, 18 April 2009 (UTC)[reply]

Basic block diagram

It's been decades since I've been that deep into the matter, but we used to have these simple block diagrams of the essential components of a microprocessor. If someone knows what I'm talking about and still has one or can find one, it would be nice if we could add something like that to this page. 71.236.24.129 (talk) 09:59, 13 May 2009 (UTC)[reply]

Early History

Datapoint never used the 8008 or 8080, although they did play a role in their creation. They were too slow. The only unit I recall that used a single-chip "microprocessor" was their 15xx series, which used the Z80. More info here: http://www.old-computers.com/museum/doc.asp?c=596

and more: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9111341

Ken (talk) 15:40, 26 May 2009 (UTC)[reply]

I have changed the appropriate text in the main article to reflect this. Ken (talk) 02:31, 5 June 2009 (UTC)[reply]

Mfgs and markets

So, I'm in a minor edit war with what I assume to be the same anonymous contributor (the IP address varies, but the writing style and method are the same -- you may want to register an account to make things clearer, or at least provide a handle in the edit summary). I keep removing a giant list of manufacturers, and the other contributor keeps putting it back in, with an edit summary implying a concern that the article gives the impression that general-purpose PCs are the only application of microprocessors.

I find that a reasonable concern. If you want to make sure it is understood that microprocessors are used in both general-purpose and embedded designs, by all means do so. But please do so in prose, by discussing applications in both general-purpose PCs and embedded systems. Ideally, cite market-share figures from reliable sources for both applications. It would be nice to know what the percentages are. (Be aware that we currently draw a distinction between microprocessor and microcontroller. Perhaps both articles should be clarified.)

However, I must insist that dumping a huge list of manufacturers into the article is the wrong thing to do. This is purely an editorial/style objection. Lists belong in the list pages we already have. They should not be duplicated here.

Thanks. —DragonHawk (talk|hist) 17:59, 26 December 2009 (UTC)[reply]

Voyager didn't have an 1802 µprocessor on board.

Well, at least according to the RCA 1802 article it didn't. —Preceding unsigned comment added by Stib (talkcontribs) 23:44, 25 May 2010 (UTC)[reply]

Transputer - not worth a mention at all

http://en.wikipedia.org/wiki/Transputer

Skyshack (talk) 17:45, 1 April 2011 (UTC)[reply]

So? Why should this article mention the Transputer family? Tell us more. --Wtshymanski (talk) 18:35, 1 April 2011 (UTC)[reply]
I agree that it deserves at least a passing mention. Can't say why exactly, other than my vague opinion "it is a significant part of microprocessor history", especially concerning parallelism -- Nczempin (talk) 22:20, 1 April 2011 (UTC)[reply]
Having failed to delete all of WP's articles on transistors, Wtshymanski is presumably shifting his attention to microprocessors.
We should probably mention Viper too. Even though it was a failure, it was a notable attempt in one particular direction. Andy Dingley (talk) 22:34, 1 April 2011 (UTC)[reply]
Not sure if the potshot was necessary. Wtshymanski asked in a perfectly neutral tone why the OP thought the Transputer should be included. Given that at least one other person (yours truly) cannot easily deliver a more convincing argument than his opinion, the question seems entirely appropriate. Viper would IMHO have to demonstrate a lot more importance in its own article before it could be considered here. Perhaps you want to start another thread. It does make the Transputer look good in comparison; not sure if that was the intent :-). -- Nczempin (talk) 22:53, 1 April 2011 (UTC)[reply]
Wtshymanski has spent the last week parroting the view that anything electronic with a part number is inherently non-notable. Stuff that. Andy Dingley (talk) 22:57, 1 April 2011 (UTC)[reply]

Getting back to the topic of this article, was the Transputer as big a deal as it seemed at the time? It got a lot of press but seems to have faded away as "regular" microprocessors caught up; I wonder why the Transputer didn't keep its lead over more complicated processors. --Wtshymanski (talk) 23:37, 1 April 2011 (UTC)[reply]

Well, first of all, it was (and the article is) somewhat disadvantaged as a British design in an American-dominated world (this by itself is of course no justification for its inclusion, just a note on why it seems to be overlooked quite a bit). Secondly, it was probably a great deal ahead of its time; only now that the GHz limit has pretty much been reached are we starting to "re-discover" parallelism. I wouldn't want a whole page on the Transputer in the microprocessor article, but a sentence or two wouldn't be wrong. Perhaps it would make sense to start by giving it a little more space in the more specialized articles on parallel computing. -- Nczempin (talk)
Why, are you planning to delete this too? Andy Dingley (talk) 00:19, 2 April 2011 (UTC)[reply]
No, I'm editing encyclopedia articles. You can, too. My rather elderly IEEE computer encyclopedia has a disappointing article on the transputer, which tells me nothing about why they didn't catch on. The multiple communication ports in the processor seem not to have endured in other designs (though someone much more hip to the current Intel designs might shed some light here). Parallelism doesn't seem to get applied except for graphics and number-crunching. --Wtshymanski (talk) 14:58, 2 April 2011 (UTC)[reply]

Info

I think that this page doesn't offer enough information, such as how microprocessors function, how the transistors work, and what types of transistors there are, such as MOSFETs. Then again, not a lot of people need to learn all that. — Preceding unsigned comment added by Patrick-liu11 (talkcontribs) 19:14, 3 April 2011 (UTC)[reply]

We have more than one page in Wikipedia. How they work: CPU, CPU design. What microprocessors are made out of: Logic gate, Transistors (including MOSFETs). If you feel that those articles are deficient in some way, go ahead and help improve them! Note, however, that Wikipedia is not a textbook. Perhaps Wikibooks or Wikiversity would be more appropriate. The pages on those sites can also be improved by you. -- Nczempin (talk) 21:16, 3 April 2011 (UTC)[reply]

TMS 1000

This section implies that, in the opinion of the Smithsonian staff, the TMS 1000 was the first microprocessor. In fact, the link is to a page from a book called STATE OF THE ART that the Smithsonian has scanned in. The bottom of the page says "The National Museum of American History and the Smithsonian Institution make no claims as to the accuracy or completeness of this work." The information in this section was discredited in connection with litigation in the 1990s, when Texas Instruments claimed to have patented the microprocessor. In response, Lee Boysel assembled a system in which a single 8-bit AL1 was used as part of a courtroom demonstration computer system, together with ROM, RAM and an input-output device. See the Wikipedia article on Four Phase Systems: http://en.wikipedia.org/wiki/Four_Phase_Systems_AL1 — Preceding unsigned comment added by GilCarrick (talkcontribs) 16:43, 8 June 2011 (UTC)[reply]