History of computing
The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables. The timeline of computing presents a summary list of major developments in computing by date.
Concrete devices
Computing is intimately tied to the representation of numbers. But long before abstractions like number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:
- one-to-one correspondence, a rule to count how many items, say on a tally stick, which was eventually abstracted into number;
- comparison to a standard, a method for assuring reproducibility in a measurement, for example, the number of coins;
- the 3-4-5 right triangle, a device for assuring a right angle, for example by using a rope with 12 evenly spaced knots.
Numbers
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known languages have words for at least "one" and "two", and even some animals such as the blackbird can distinguish a surprising number of items.[citation needed]
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. In time the operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
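As an illustration of such a formalized operation, here is a minimal Python sketch of Euclid's algorithm for the greatest common divisor mentioned above; the sample values are arbitrary.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b) by (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21, the greatest common divisor of 1071 and 462
```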
By the High Middle Ages, the positional Hindu-Arabic numeral system had reached Europe, allowing the systematic computation of numbers. During this period, writing a calculation down on paper made it possible to evaluate mathematical expressions and to tabulate mathematical functions such as the square root, the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in the twentieth century researchers like Enrico Fermi would cover random scraps of paper with calculation, to satisfy their curiosity about an equation. Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps that overflowed the calculators' memory, just to learn the answer.
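The role of the common logarithm in multiplication and division can be shown with a short Python sketch; it uses the built-in math.log10 as a modern stand-in for a printed logarithm table, which is an assumption for illustration rather than anything described here.

```python
import math

def multiply_via_logs(x: float, y: float) -> float:
    """Multiply two positive numbers by adding their common logarithms,
    the way a log table (or slide rule) reduces multiplication to addition:
    log10(x * y) = log10(x) + log10(y)."""
    return 10 ** (math.log10(x) + math.log10(y))

print(multiply_via_logs(3.7, 42.0))  # ~155.4, matching 3.7 * 42.0
```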
Navigation and astronomy
Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
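The table-plus-interpolation technique described above can be sketched as follows; the coarse ten-degree sine table and the Python code are illustrative assumptions, not a reconstruction of any historical table.

```python
import bisect
import math

# A coarse "mathematical table" of sines at 10-degree intervals, standing in
# for the printed tables once used in navigation and astronomy.
DEGREES = list(range(0, 91, 10))
SINES = [math.sin(math.radians(d)) for d in DEGREES]

def sin_from_table(deg: float) -> float:
    """Look up sin(deg) by linear interpolation between tabulated entries."""
    i = bisect.bisect_right(DEGREES, deg) - 1
    if i >= len(DEGREES) - 1:          # at or past the last entry
        return SINES[-1]
    d0, d1 = DEGREES[i], DEGREES[i + 1]
    s0, s1 = SINES[i], SINES[i + 1]
    return s0 + (s1 - s0) * (deg - d0) / (d1 - d0)

print(sin_from_table(37.0))            # ~0.600 by interpolation
print(math.sin(math.radians(37.0)))    # ~0.602 exact, for comparison
```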
In our time, even a student can simulate the motion of the planets, an N-body differential equation, using the concepts of numerical approximation, a feat which even Isaac Newton could admire, given his struggles with the motion of the Moon.
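A student-style simulation of the kind mentioned here might look like the following Python sketch, which steps a single planet around the Sun with semi-implicit Euler updates; the units, step size, and starting values are illustrative choices, not astronomical data.

```python
import math

G_M = 4 * math.pi ** 2      # gravitational parameter of the Sun in AU^3 / yr^2
x, y = 1.0, 0.0             # planet starts 1 AU from the Sun
vx, vy = 0.0, 2 * math.pi   # roughly circular orbital speed in AU / yr
dt = 0.001                  # time step in years

for _ in range(1000):       # integrate one year of motion
    r = math.hypot(x, y)
    ax, ay = -G_M * x / r**3, -G_M * y / r**3   # Newtonian gravity
    vx, vy = vx + ax * dt, vy + ay * dt         # update velocity first...
    x, y = x + vx * dt, y + vy * dt             # ...then position

print(round(x, 3), round(y, 3))  # ends close to the starting point (1, 0)
```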
Weather prediction
The numerical solution of differential equations, notably the Navier-Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving differential equations. To this day, some of the most powerful computer systems on Earth are used for weather forecasts.
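The finite-difference idea behind such numerical approaches can be hinted at with a much simpler example; the sketch below marches the one-dimensional heat (diffusion) equation forward in time, which is only a stand-in for the far more involved weather equations.

```python
# Explicit finite-difference stepping of the 1D heat equation u_t = k * u_xx:
# the same basic idea of marching a grid of values forward in time.
k, dx, dt = 1.0, 0.1, 0.001           # diffusivity, grid spacing, time step
u = [0.0] * 50
u[25] = 1.0                            # an initial "hot spot" mid-grid

for _ in range(200):                   # take 200 explicit time steps
    new_u = u[:]
    for i in range(1, len(u) - 1):     # interior points only (ends held at 0)
        new_u[i] = u[i] + k * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    u = new_u

print(round(max(u), 3))                # the peak has spread out and lowered
```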
Symbolic computations
By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses. Using programs such as Maple, Macsyma (now Maxima), and Mathematica, as well as open-source programs such as Yacas, it is now possible to visualize concepts, such as modular forms, that were previously accessible only to the mathematical imagination.
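For a flavour of such symbolic manipulation, here is a sketch using SymPy, a Python library not mentioned above, performing two college-calculus tasks symbolically rather than numerically.

```python
import sympy as sp

x = sp.symbols('x')

# Differentiate and integrate symbolically, as a computer algebra system does.
derivative = sp.diff(sp.sin(x) * sp.exp(x), x)   # d/dx [sin(x) * e^x]
integral = sp.integrate(x ** 2 * sp.cos(x), x)   # indefinite integral

print(derivative)  # exp(x)*sin(x) + exp(x)*cos(x)
print(integral)    # x**2*sin(x) + 2*x*cos(x) - 2*sin(x)
```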
See also
- History of computing hardware
- Timeline of quantum computing
- History of computer science
- Algorithm
- List of mathematicians
- Computing timelines category
- Virtual Museum of Computing
- The Snoopy Calendar program is the classic Fortran program referenced in many nostalgic items on computer history (e.g. "Real Programmers Don't Use Pascal").
- History of free software
External links
- IEEE Annals of the History of Computing
- Richmond (UK) History of Computing Group
- The History of Computing by J.A.N. Lee
- The History of Computing Project
- SIG on Computers, Information and Society of the Society for the History of Technology
- A History of Computers
- Stanford Encyclopedia of Philosophy entry
- The history of computer
- Charles Babbage Institute: Center for the History of Information Technology
- Key Resources in the History of Computing
- Cringely's "Triumph of the Nerds"
- A Chronology of Digital Computing Machines (to 1952) by Mark Brader
- Bitsavers, an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 50s, 60s, 70s, and 80s
- Soviet calculators and computers collection by Sergei Frolov
- emula3.com Emulation plus historic documents and images (see Library and Gallery sections)
- Cyberhistory (2002) by Keith Falloon. UWA digital thesis repository.
Computer History Museums
- Computer History Museum
- German museum of computer technology, with devices still in working order
- The Virtual Museum of Early Computers