Tyranny of numbers

From Wikipedia, the free encyclopedia


The tyranny of numbers was a problem faced in the 1960s by computer engineers. Engineers were unable to increase the performance of their designs due to the huge number of components involved. In theory, every component needed to be wired to every other one, and these connections were typically strung and soldered by hand. Improving performance would require more components, and it seemed that future designs would consist almost entirely of wiring.
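The combinatorial growth implied here can be sketched numerically: if every component must be wired to every other, the number of point-to-point connections grows as n(n−1)/2, so wiring quickly dominates the design. The following short illustration (not part of the original article; the function name is ours) shows how fast this count explodes:

```python
def pairwise_connections(n_components: int) -> int:
    """Number of wires needed to connect each of n components to every other one."""
    return n_components * (n_components - 1) // 2

# Doubling the component count roughly quadruples the wiring.
for n in (10, 100, 1000, 10000):
    print(f"{n:>6} components -> {pairwise_connections(n):>10} connections")
```

Even though real machines were never wired as a complete graph, the quadratic trend conveys why hand-soldered interconnect, not the components themselves, became the limiting factor.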

The term was first used by the Vice President of Bell Labs in 1957 in a paper celebrating the 10th anniversary of the invention of the transistor. Referring to the problems many designers were having, he stated:

For some time now, electronic man has known how 'in principle' to extend greatly his visual, tactile, and mental abilities through the digital transmission and processing of all kinds of information. However, all these functions suffer from what has been called 'the tyranny of numbers.' Such systems, because of their complex digital nature, require hundreds, thousands, and sometimes tens of thousands of electron devices.

— Jack Morton, The Tyranny of Numbers

At the time, computers were typically built up from a series of "modules", each module containing the electronics needed to perform a single function. A complex circuit like an adder would generally require several modules working in concert. The modules were typically built on printed circuit boards of a standardized size, with a connector on one edge that allowed them to be plugged into the power and signaling lines of the machine, and were then wired to other modules using twisted pair or coaxial cable.

Since each module was relatively custom, modules were assembled and soldered by hand or with limited automation. As a result, they suffered major reliability problems: even a single bad component or solder joint could render the entire module inoperative. Even with properly working modules, the mass of wiring connecting them was another source of construction and reliability problems. As computers grew in complexity and the number of modules increased, making a machine actually work became more and more difficult. This was the "tyranny of numbers".

It was precisely this problem that Jack Kilby was thinking about while working at Texas Instruments. Theorizing that germanium could be used to make all common electronic components, such as resistors and capacitors, he set about building a single-slab component that combined the functionality of an entire module. Although he succeeded in this goal, it was Robert Noyce's silicon version and the associated fabrication techniques that made the integrated circuit (IC) truly practical.

Unlike modules, ICs were built using photoetching techniques on an assembly line, greatly reducing their cost. Although any given IC might have the same chance of working as a module, ICs cost so little that if one did not work it was simply thrown away and another tried. In fact, early IC assembly lines had failure rates of around 90% or greater, which kept their prices high. The U.S. Air Force and NASA were major purchasers of early ICs, as their small size and light weight outweighed any cost issues.

ICs from the early 1960s were not complex enough for general computer use, but as their complexity increased through the 1960s, practically all computers switched to IC-based designs. The result was what are today referred to as third-generation computers, which became commonplace during the early 1970s. The progeny of the integrated circuit, the microprocessor, eventually superseded the use of individual ICs as well, placing the entire collection of modules onto one chip.

Seymour Cray was particularly well known for making complex designs work in spite of the tyranny of numbers. His attention to detail, and his ability to fund several attempts at a working design if need be, meant that pure engineering effort could overcome the problems he faced. Yet even Cray eventually succumbed to the problem during the CDC 8600 project, which ultimately led to his leaving Control Data.
