Talk:Central Air Data Computer


Multichip or not multichip?

The F-14 CADC was NOT multichip. This is a canard put out by the pro-Intel forces to attempt to discredit the CADC as a microprocessor developed before the 4004. The only thing the CADC did NOT have that the Intel 4004 does is a program counter. The program counter in the CADC architecture was placed on the RAS (Random Access Storage, otherwise known as RAM) and ROM chips to facilitate multi-tasking/processing. A PC could easily have been implemented on the primary processor, but this was a practical design decision only.

Read more about the CADC at the CADC website. --Ray Holt
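
To make the per-memory program counter idea above concrete, here is a minimal sketch (plain Python, purely illustrative: the instruction set, class names and round-robin scheme are invented for the example, not taken from the CADC documentation) of an execution unit that keeps no program counter of its own and instead fetches through a counter resident on each memory module, so switching tasks needs no PC save/restore:

    # Illustrative only -- not the actual CADC logic or instruction set.
    class MemoryModule:
        """ROM/RAS module holding a program and its own program counter."""
        def __init__(self, program):
            self.program = program
            self.pc = 0              # the PC lives with the memory, not the CPU

        def fetch(self):
            instr = self.program[self.pc]
            self.pc = (self.pc + 1) % len(self.program)
            return instr

    class ExecutionUnit:
        """'CPU' with no program counter; it only executes what is fetched."""
        def __init__(self):
            self.acc = 0             # a single accumulator register

        def step(self, module):
            op, arg = module.fetch()
            if op == "ADD":
                self.acc += arg
            elif op == "MUL":
                self.acc *= arg
            return self.acc

    # Two tasks, each carrying its own resident PC; the execution unit
    # alternates between them with no PC save/restore at all.
    task_a = MemoryModule([("ADD", 1), ("MUL", 2)])
    task_b = MemoryModule([("ADD", 10), ("ADD", 20)])
    cpu = ExecutionUnit()
    for _ in range(4):
        cpu.step(task_a)
        cpu.step(task_b)

Under this (assumed) arrangement the processor stays stateless with respect to control flow, which is one way keeping the PC on the memory chips could ease multi-tasking, as described above.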

This is all extremely interesting. The reason I myself haven't looked more closely into whether the CADC was a single- or multi-chip µP is quite simply the website's overall emphasis on the word "chip set", implying more than one chip to make up the CPU. (However, now that I think about it, the 4004 was also an integral part of a chipset, the MCS-4: RAMs, ROMs, I/O circuits, etc., without which the CPU would be pretty useless.)
After reading your comment above and browsing Holt's 1971 paper, I'm still not 100% sure I completely understand how the CADC system would make up a CPU by combining a single unit of the chipset with a ROM (and optionally a RAS). Therefore, could you please specify what you mean by the "primary processor"? Is it one of the PMU/PDU/SLF units, or one of the other units mentioned?
As for the "first µP" question, IMO two points of consideration are:
  1. The CADC is clearly a much more capable and in fact scalable µP system (more of "a real computer") than the 4004 & Co.
  2. The 4004 may arguably still be reckoned as the first single-chip µP if it is the only CPU fitting the "official definition"(?) of a single-chip µP.
But another way to look at all this would perhaps be to consider how many chips one actually needs to build a simple computer: the 4004 would also at least need a ROM chip, as the CADC does, to hold its instructions. A RAS/RAM is not essential, of course, since the computer in question might just be doing some simple I/O and calculations fitting within its on-chip registers. Please comment, folks. --Wernher 04:42, 23 November 2005 (UTC)
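
On the how-many-chips question above, a rough sketch (again invented, illustrative Python) of a machine that runs entirely out of ROM and on-chip registers, with no RAM/RAS at all:

    # Illustrative three-instruction machine: ROM plus registers, no RAM.
    ROM = [
        ("LOAD", "r0", 5),        # r0 <- 5
        ("LOAD", "r1", 7),        # r1 <- 7
        ("ADD",  "r0", "r1"),     # r0 <- r0 + r1
        ("OUT",  "r0", None),     # write r0 to an output port
    ]

    regs = {"r0": 0, "r1": 0}     # on-chip registers: the only writable state

    for op, a, b in ROM:          # this loop index plays the role of the PC
        if op == "LOAD":
            regs[a] = b
        elif op == "ADD":
            regs[a] = regs[a] + regs[b]
        elif op == "OUT":
            print("port:", regs[a])   # simple I/O; no RAM is ever touched

All working data fits in the registers, so the ROM is the only memory chip such a system needs.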


The F-14 CADC had one main chip, called the SLF, that acted as the CPU. All the other chips were RAM, ROM and support. The entire F-14 computer was made up of combinations of 6 chip types. The Intel 4004 was also part of a set, and it also required 59 TTL circuits around it to work. As for the 4004 arguably being the first 'single chip' fitting the 'official definition': there was no definition at the time the 4004 was designed. All the definitions came later, in trying to fit the 4004 as being first. Five years after the F-14 CADC design, Intel was still grappling with the definition; see this 1973 statement from Hank Smith, Microprocessor Marketing Manager, Intel Corp.:

IEEE 1973 WESCON Professional Program, Session 11 Proceedings: "A CPU uses P-channel MOS and is contained in 1, 2, 3 or 4 LSI standard dual-in-line packages from 16–42 pins per package." -- [Ray Holt]

Four-Phase Systems AL1

There is also the Four Phase Systems AL8, which preceded the CADC by a couple of years and may well have also been a single-chip 8-bit microprocessor, but it was intended to be used as an 8-bit slice of a 24-bit architecture. --Anonymous

Perhaps you mean the AL1? It almost seems to be a "chip lost to history". :-) Google got me one barely "usable" hit on it, The 8008 and the AL1, which is nothing more than a mailing list excerpt from the "Interesting People Elist". I have written to one of the participants, Nick Tredennick, asking him if he could help us get some more docs on the AL1. --Wernher 06:03, 23 November 2005 (UTC)
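
For readers unfamiliar with the bit-slice approach the anonymous comment refers to, here is a generic sketch (hypothetical Python, not the actual AL1 design) of three 8-bit adder slices cascaded through their carry signals to form one 24-bit add, which is essentially how an 8-bit slice serves a 24-bit architecture:

    # Generic bit-slice illustration -- not AL1-specific.
    def slice_add(a, b, carry_in):
        """One 8-bit ALU slice: returns (8-bit sum, carry out)."""
        total = (a & 0xFF) + (b & 0xFF) + carry_in
        return total & 0xFF, total >> 8

    def add24(a, b):
        """A 24-bit add built from three cascaded 8-bit slices."""
        result, carry = 0, 0
        for i in range(3):                   # low, middle, high slice
            s, carry = slice_add(a >> (8 * i), b >> (8 * i), carry)
            result |= s << (8 * i)
        return result & 0xFFFFFF

    assert add24(0x0000FF, 0x000001) == 0x000100   # carry ripples across slices
    assert add24(0xFFFFFF, 0x000001) == 0x000000   # wraps at 24 bits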

Sources

This article has serious problems with sources. I found three of the different URLs cited were actually verbatim copies of the same page. That page appears to be self-published by Ray Holt himself. While I certainly respect Mr. Holt's achievements, that's got nothing to do with the requirements for acceptable sources for this encyclopedia. Further, two of the citations were for specific quotes, and the quotes were not actually present in the text given. I don't know if this is due to link rot or what. Finally, the one remaining source is either moved or deleted, and the site restricts archiving so we can't get it back at that URL. In short, my attempts to verify the information in this article actually made things worse. If any of the original authors are watching, could you please see about clearing up your sources? I'll help with citation formats and such if needed. Thanks. —DragonHawk (talk|hist) 21:25, 30 November 2009 (UTC)

I made a correction about the 20-bit ADC before reading the web page, which says "Two state-of-the-art quartz sensors, a 20-bit high precision analog-to-digital converter, a 20-bit high precision digital-to-analog converter, the MOS-LSI chip set, and a very efficient power unit made up the complete CADC." I still don't believe it is possible, or likely, that there was a 20-bit ADC involved; certainly Ray's PPT slide show makes no such claim. A 20-bit ADC would be incredible overkill, and would have been nearly impossible to make in that era; it would make no sense to use only 20 bits in the digital part if the ADC was better than about 14 bits, which would have been state-of-the-art at the time. Dicklyon (talk) 19:25, 26 August 2011 (UTC)

The ADC and DAC in the CADC were in fact 20-bit. This resolution was determined by the applied mathematician on the project. 14-bit ADCs and DACs were state-of-the-art at the time; however, Garrett AiResearch engineers (i.e. Tom Redfern and group) did design a reliable 20-bit ADC and DAC for this project. As far as sources go, I have published much of the documentation on my website, firstmicroprocessor.com, and I have the original documentation (design notebook and CADC technical manual) in my possession. [Ray Holt]
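
As a back-of-the-envelope illustration of the 14-bit versus 20-bit resolution numbers debated above (the 80,000 ft full-scale span is an assumed figure for the example, not taken from the CADC documentation):

    # Quantization step size for an assumed 80,000 ft full-scale range.
    FULL_SCALE = 80000.0

    for bits in (14, 20):
        steps = 2 ** bits
        print(f"{bits}-bit converter: {steps:>9,} steps, "
              f"LSB = {FULL_SCALE / steps:.4f} ft")

    # 14-bit: 16,384 steps, LSB ~ 4.88 ft
    # 20-bit: 1,048,576 steps, LSB ~ 0.08 ft (64x finer)

That 64x difference in step size is what separates the two positions above: overkill on one reading, a deliberate precision margin on the other.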