Talk:Analog computer

From Wikipedia, the free encyclopedia
This article is of interest to the following WikiProjects:

WikiProject Systems - not yet rated on the project's quality or importance scales; not associated with a particular field.
WikiProject Computing - rated B-class, Mid-importance; supported by the Early computers task force and the Computer hardware task force (marked as Mid-importance).
WikiProject Technology - rated B-class.
WikiProject Mathematics - rated B-class, Mid-importance; field: Applied mathematics.

What it was like using these systems[edit]

An interesting article and interesting comments. I programmed my first analog computer in 1959 for reactor research. We should realise that back then it was not at all clear which direction the DP (not IT!) industry would go. At DP exhibitions you saw as much reference to analog as to digital. With the benefit of hindsight we can see that analog was never going to be the 'leader', but its ability to literally PLUG in a whole logical SET of algorithms and connect them through a large (very heavy!) plug board - of which I still have one - made it a remarkable tool. An analog computer (we used 3 PACE systems) could model the physical world far more quickly and (possibly) with more accuracy than the digital machines of the time. Our research division used the largest digital systems to cross-verify the work done by each technique! It was a TRUE Analog vs Digital battle!! In passing... I programmed my first computer in 1948 and would like to catch up with anyone who did work on these machines in that period. But how the 'world' has changed!! tn June 2011 — Preceding unsigned comment added by 81.136.222.6 (talk) 01:20, 4 June 2011 (UTC)

There needs to be a discussion, or at least a link (Wiki, etc.), re: Charles Babbage (see wiki article) and his good design, if unsuccessful build (c. 1850), of an analog computer. 65.222.113.35 (talk) 16:14, 9 November 2010 (UTC)

Babbage's designs were digital, not analogue. Man with two legs (talk) 16:22, 9 November 2010 (UTC)

abzorba 07:49, 26 September 2006 (UTC) [editorial commentary moved from main page:] A computer is defined by a Turing machine. The Turing machine must have a stack. The slide rule has nothing which can be used as a stack. Furthermore, the markings on a slide rule are discrete, not analog, making it digital. However, it would be nice to have a replacement here...!!

There are many definitions of 'computer' which do not involve Turing machines... and the markings are only guidelines; one can still estimate infinite gradations between the markings.

Hello. This is an interesting article. There is an assertion towards the end that analog computers are used in earthquake prediction, and that they're superior to digital computers for that purpose. That is an interesting assertion. I wonder if someone has some references about that. I searched the web but all I found were two refs (no. 166 and 167) at [1]; these two date from the 1940s. Thanks for any info, Wile E. Heresiarch 01:24, 20 Mar 2004 (UTC)

I put that in there and admittedly I don't recall where I learned it, but I am certain that it is true. I am quite certain that there are also many other applications in which a specialized analog computer is superior to a digital computer.

I don't, however, find this interesting. I find it interesting, rather, that it is in general found interesting. After all, a digital computer is just a special type of analog computer, and it has been known for a long time that hybrid computers are far more powerful than digital computers, at least for scientific problems. (Try your local library.)

In any case, I assure you that the assertion can be confirmed. -- Kevin Baas 03:21, 20 Mar 2004 (UTC)

The article gives an indication as to why this would be true, though. Computation on an analog computer is much more dynamic than on a digital computer - computation is almost instantaneous on an analog computer, whilst a digital computer must repeatedly recalculate things as the inputs change. Dysprosia 03:27, 20 Mar 2004 (UTC)

How can I remove the table of contents? 666 20:23, 28 Mar 2004 (UTC)

You shouldn't. If you don't like it showing up, click [hide] on the TOC. Dysprosia 22:34, 28 Mar 2004 (UTC)

Analogy[edit]

I think this article should differentiate between the concept of using physical analogies for computing and their various implementations.


It would be nice if this subjective nonsense about springs were replaced by a less bulky and more straightforward explanation like "analog computers use the information encoded in the entire voltage range of a signal, as opposed to digital computers, which only use signals which are either logic high or logic low."

I wonder if quantum computers belong here in the comparison of differences, or if there should be a separate article that tries to explain the significant differences between

Computing elements[edit]

Concerning the computing elements of electronic analog computers: The list given in the article is not correct:

All active computing elements (apart from multipliers) change the sign, so summing terms will automatically change the sign of the result; inverting should be removed from this general list (some analog computers had special summers with only one input, called "inverters", but these are summers nevertheless).

Exponentiation and log-calculation are not basic operations for an analog computer. When these functions are needed, they are normally implemented using a diode-based function generator which approximates the exp-/log-function by polygonal segments, which is quite error-prone, so these operations are avoided wherever possible! Sometimes exponentiation can be performed by integration (which, by the way, changes the sign of the result, too).

Multiplication is missing from the basic operations - every practical analog computer has multipliers; typical were servo multipliers, time-division multipliers and parabola multipliers. Some more arcane schemes used special electron tubes like the "Hyperbelfeld" tube developed at the Technical University of Darmstadt.

Normally differentiation is a no-no in analog computing, since differentiating a function will increase the noise! Normally the differential equations to be solved are integrated so many times that all derivatives vanish. This results in low noise and makes differentiation normally unnecessary. There are trick circuits to convert an integrator into a differentiator by using an open summer (inverted function).

That's all - CU - Bernd. 213.139.158.197 17:16, 19 December 2005 (UTC) Bernd Ulmann, ulmann@vaxman.de, http://www.vaxman.de/analog_computing/analog_computing.html
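Bernd's point that differentiation amplifies noise while integration suppresses it can be checked numerically. The snippet below is a sketch only (NumPy, with an assumed noise level of 0.001 on a unit sine wave), not a model of any particular machine:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 10_000)
dt = t[1] - t[0]

# A unit sine wave carrying a small amount of measurement noise.
signal = np.sin(t) + 0.001 * rng.standard_normal(t.size)

# Differentiation divides each noise sample by dt, amplifying it.
derivative = np.gradient(signal, dt)               # ideal answer: cos(t)
deriv_error = np.std(derivative - np.cos(t))

# Integration averages the noise away instead.
integral = np.cumsum(signal) * dt                  # ideal answer: 1 - cos(t)
integ_error = np.std(integral - (1.0 - np.cos(t)))

# The derivative's error dwarfs the integral's.
assert deriv_error > 10 * integ_error
```

With these numbers the derivative's error comes out orders of magnitude larger than the integral's, which is exactly why analog programs were rearranged to use only integrators.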


There's a quite straightforward way to multiply with electrical-only circuits. All you need is a couple of quarter-squarers, then use the relationship ab = ((a+b)/2)**2 - ((a-b)/2)**2. Or you can use the Hall effect for direct multiplication. PML.
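PML's quarter-square identity is easy to verify in software. A minimal Python sketch (the function name is mine, not any standard API):

```python
def quarter_square_multiply(a: float, b: float) -> float:
    """Multiply using only two squarers and a subtraction:
    a*b = ((a+b)/2)**2 - ((a-b)/2)**2."""
    return ((a + b) / 2.0) ** 2 - ((a - b) / 2.0) ** 2

# The identity holds for any signs of the operands.
assert quarter_square_multiply(3.0, 4.0) == 12.0
assert quarter_square_multiply(-2.5, 8.0) == -20.0
```

In hardware, each squarer was typically a diode function generator, so a full four-quadrant multiplier needs only two nonlinear elements plus summers.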

Believe me, my electronics is pretty rusty. I assume what you say is true, so I removed the electro-mechanical reference, but then added multiplication and division to the list of operations. normxxx 12:55, 31 December 2005 (UTC)


From the article: Analog systems are understood only as continuous, time variant electrical systems. From the above discussion, it should be obvious that this is not correct.

Could someone expand on this passage? I'm not sure it's obvious to the layman.

I am sure it's not. But I'm equally not sure that the necessary elaboration (e.g., involving discontinuous functions) would improve things. Do we really need to be this detailed? Can we omit it?

Also, where I went to school, Turing machines were for digital computers only, and slide rules were considered analog devices, since the calculations are continuous. (Try using a circular slide rule equivalent to about a hundred feet or so to see that you can get any gradation you can eyeball.) Simple voltmeters and ammeters are marked - does that make them digital? Digital computers manipulate symbols (1's and 0's for the Turing machine); analog computers manipulate physical values, e.g., voltages, currents, and waveforms. normxxx 13:28, 31 December 2005 (UTC)


Hey Guys, Please Review and Comment or Polish[edit]

My last changes, after my (digital) computer ate them one time, are substantial. And, I am definitely out-of-date. But in my youth (in the late '50s and early '60s), when analog radar was gradually converting to digital (by way of hybrid) and flight simulators were all analog, I was a hot-shot analog computerist and could "simulate" systems (generally non-mathematically, although I liked using Boolean algebra for switching circuits) in my sleep. For what it's worth, such direct analog translation from real-world phenomena to analog, then called 'simulation', is nothing less than modern object-oriented analysis! Plus ça change, plus c'est la même chose (the more things change, the more they stay the same). normxxx 21:15, 31 December 2005 (UTC)

Digital != binary[edit]

From the article: In fact, digital also has a precise technical definition. In the context of circuits, it refers to the use of binary electrical pulse codes for symbols and the manipulation of these symbols in the operation of the digital computer.

Some memory technologies use more than two voltage levels. Some computers use(d) base-ten arithmetic; "binary" seems to indicate base-two arithmetic. The phrase "pulse codes" implies time-variance, whereas a computer's clock can be stopped and then continued without loss of data (unless it uses DRAM).

Perhaps "discrete voltage levels" should replace "binary electrical pulse codes."

I made the indicated changes; problem is, since a symbol is determined by its definition, almost anything can be a symbol. normxxx| talk email 04:32, 9 March 2006 (UTC)

I'm also not sure about equating electrical signals with symbols. Letters are symbols, but the strokes of a letter are not symbols. A byte can stand for 'a', but a bit in that byte is not usually a symbol. ---- User:Cphoenix (sig added by Cburnett)

A symbol is determined by its exact definition. normxxx| talk email 04:32, 9 March 2006 (UTC)
I don't see a problem with that change.
In the context of wireless transmissions, a symbol is a specific portion of an electrical signal. For your typical digital signal (think of the 7400 logic chip series), you can consider +5V as a 1 symbol and 0V as a 0 symbol. Or dominant & recessive symbols on a Controller Area Network. Cburnett 05:51, 16 February 2006 (UTC)

Photo[edit]

Can we get a photo of a mechanical implementation of an analog computer? Additional photos would also be appreciated. RJFJR 02:48, 9 March 2006 (UTC)

The Norden bombsight is a pretty sophisticated mechanical analog computer. Does either of the photos there suit you? --Carnildo 04:43, 9 March 2006 (UTC)
Thanks! That was an excellent suggestion. normxxx| talk email 09:23, 10 March 2006 (UTC)
How about a picture of a slide rule? If that's not a mechanical analog computer, then I don't know what is. -- RoySmith (talk) 13:50, 10 March 2006 (UTC)
I don't think the bombsight does much for the article. (We know it is analog - but does the reader believe us?) I think a planimeter, or slide rule, or op-amp integrator would better illustrate it. Wizzy 13:55, 10 March 2006 (UTC)

spelling[edit]

analog computer or analogue computer?

Depends on who you talk to. --Carnildo 05:12, 11 June 2006 (UTC)
'Analog' is the US spelling, 'Analogue' pretty much everywhere else. 81.179.71.139 12:07, 3 April 2007 (UTC)

Unrestrained Verbiage Obfuscates Substance[edit]

The content is great but this reads like a college textbook. I don't feel I'm an idiot, but I had to read the first paragraph three times before I understood exactly what was being said. Any 5th grader trying to find out what an analog computer is is going to be blown away. Wiki guidelines specify that articles should start out very simple and progress to more difficult material.

"An analog computer is a form of computer that uses electrical or mechanical phenomena to model the problem being solved by using one kind of physical quantity to represent another. The central concept among all analog computers can be better understood by examining the definition of an analogy. The similarities of an analogy define the salient characteristics of the comparison, but the differences in an analogy are also important. Modeling a real physical system in a computer is called simulation."

This could be said as follows:

"An analog computer is a computer that uses one physical quantity to represent another."

This paragraph could end right here and people would be more enlightened than had they read the entire first paragraph. The rest is tangential information that doesn't really help or clarify. How is knowing the differences important? In what way does knowing the definition of analogy help in understanding analog computers? These questions should be answered if this content is to be kept. The sentence about simulation comes out of the blue. Epachamo 15:46, 27 July 2006 (UTC)

a very simple, effective analog computer example[edit]

I read this in a short letter to Scientific American, replying to one of their articles on the subject of analog computers, some time around 1985. I remember being completely blown away by it. The writer said he worked in some kind of Surveyor's Department, and they had the job of providing accurate data on the areas of farms and the like with very complex borders. All traditional calculation methods involved splitting up a map of the land into regular-shaped plane figures, measuring those, and then doing the same with the leftover bits and so on until you judge the remaining bits to be negligible. All very fiddly and time-consuming.

His group adopted a technique that was the essence of simplicity, effectiveness and cheapness. Say you have a map of some area with a very complex border (let's use Norway with its fjord-indented coast as an example), and you have to calculate its area. Simply put this map on a piece of cardboard and trace over Norway's borders. Now cut out the shape of Norway from the cardboard. Put this cardboard cutout on one pan of a fine balance scale, and put a (pre-prepared) square piece of cardboard representing, say, 1000 square kilometres in the other. Now compare the two weights. Their ratio immediately tells you the area of Norway, or of any irregularly shaped piece of land, in a matter of seconds.
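The weighing trick boils down to a one-line proportionality: for uniform cardboard, area scales with weight. A sketch in Python, with hypothetical weights (the function name and numbers are mine):

```python
def area_by_weighing(cutout_weight_g: float,
                     reference_weight_g: float,
                     reference_area_km2: float) -> float:
    """Uniform cardboard: the cutout's area is proportional to its weight."""
    return cutout_weight_g / reference_weight_g * reference_area_km2

# Hypothetical reading: the cutout weighs 7.5 g against a 5 g
# reference square representing 1000 square kilometres.
assert area_by_weighing(7.5, 5.0, 1000.0) == 1500.0
```

The balance scale does the "division" physically; all the surveyor reads off is the ratio.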

Back in 1985, I worked next door to a Public Service office (the Lands Department) that had a section of about 20 people devoted full-time to just such calculations. When they heard about this idea, it was estimated that the entire section could be reduced to 6 people, saving millions of dollars and providing better approximations. Of course it was never done. But the idea still intrigues me.

I do agree that the definition of analog computer could be simplified and clarified at the same time, although it *is* one of those concepts that is hard to pin down in words. In the case described here, it can be clearly seen that there are two physical quantities involved, in this case weight and surface area, and that these are linked by a very simple formula. One quantity is hard to measure (surface area) and the other is easy (weight). Because the two systems are linked, we can get a result for the difficult-to-measure quantity without having to measure it directly. Instead, we measure the easy quantity and *deduce* the hard-to-measure quantity from that.

I wish Wikipedia writers would proceed along these lines when it comes to explaining things. It goes to show that laypeople writing for laypeople does not necessarily make for clarity and brevity. In fact, it often does the reverse. It shows that writing clearly for a general readership is a specialised skill in itself, often undervalued. It may seem counterintuitive, but the greatest blight on expository writing by amateurs for amateurs is not their oversimplification of things but the reverse: not being trained as teachers and communicators, they burden the exposition with so much technical detail and minutiae that the average reader is in over his head after a couple of paragraphs.

Wikipedia is full of articles striving to be comprehensive and completely accurate, forgetting that they are being read by ordinary Joes who just want to get the gist of the matter. Bit odd for a people's encyclopedia, don't you think? Back to basics, comrades, back to basics. Myles325


Speaking of examples, when I took flying lessons about 20 years back, we used a Flight Computer that was basically a purpose-made slide rule. I believe that this is a type of analog computer still in common use (among pilots). Since there is a page on this (E6B), someone more savvy about how wiki works and about analog computers may want to add it as an example and link it. 75.191.151.75 (talk) 18:29, 27 May 2010 (UTC)

Babbage[edit]

Shouldn't Babbage's mechanical computer be listed in the timeline?

No, Babbage's mechanical computer was digital, not analog. --Gerry Ashton 15:02, 9 June 2007 (UTC)
Not possible to mention this? I think a lot of people might rightly confuse analog machines with mechanical ones? Just a thought 188.220.171.33 (talk) 17:03, 20 October 2010 (UTC)

Wind tunnel[edit]

The scaling required in a wind tunnel does not change whether it is an analogue computer or not; the pressure readings that are taken could be scaled and transformed within the analogue domain, if that were required (not done because it isn't useful, rather than because it isn't possible). I would argue that a wind tunnel is just as much of a computer as the hydraulic economy model; the important thing is that it can be used to compute a wide range of possibilities, based on its input; this differentiates it from a model, which computes only a single or small number of possibilities. Pog 16:05, 30 July 2007 (UTC)

Network Analyzer[edit]

AC power systems used to be modelled by a roomful of analog components simulating generators, transmission lines, loads, transformers, tap changers, etc. - a scale model, operating at much higher than power-line frequency. These were used for load-flow and fault-study calculations before digital methods got to be more affordable. I've read some things on the Web that say these large network analyzers were sometimes borrowed by physicists for calculations in quantum theory - but I have no book on this, just some Web pages. Any network analyzer experts out there? --Wtshymanski 02:28, 4 December 2007 (UTC)

about computer type[edit]

could please distinguish between Analogue and Digital computer? —Preceding unsigned comment added by 84.66.40.33 (talk) 16:02, 4 January 2008 (UTC)

All the computers you are familiar with are digital computers. Internally, everything is done with binary numbers, which can assume one of two states, which can be called "one" and "zero", or "true" and "false". Analog computers process information as a continuously variable quantity, most often a voltage. These quantities are always continuously variable, even inside the computer. Digital computers can process continuously variable signals too, such as microphone inputs or speaker outputs, but input signals must be converted to digital, processed digitally, then converted back to analog before being output. --Gerry Ashton (talk) 17:15, 4 January 2008 (UTC)
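The convert-process-convert pipeline described above can be sketched as an idealized 8-bit ADC/DAC pair. This is a toy model under simplifying assumptions (a 0-5 V range, no sample-and-hold, no anti-aliasing); real converters are more involved:

```python
def adc(voltage: float, v_ref: float = 5.0, bits: int = 8) -> int:
    """Quantize a voltage in [0, v_ref] to one of 2**bits discrete codes."""
    levels = 2 ** bits
    code = int(voltage / v_ref * (levels - 1) + 0.5)   # round to nearest code
    return max(0, min(levels - 1, code))

def dac(code: int, v_ref: float = 5.0, bits: int = 8) -> float:
    """Map a discrete code back to an approximate voltage."""
    return code / (2 ** bits - 1) * v_ref

# The round trip loses at most one quantization step of information.
v_in = 3.21
assert abs(dac(adc(v_in)) - v_in) < 5.0 / 255
```

The continuously variable input survives only to within the quantization step, which is precisely the analog/digital boundary the comment describes.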

Brains[edit]

Would it be worth mentioning animal brains in the article as naturally occurring analog computers? Wbrameld (talk) 05:12, 27 February 2008 (UTC)

Are they? --Carnildo (talk) 05:31, 27 February 2008 (UTC)
Not really; the signals in the brain are in some ways more digital than analog. It's really not either. Dicklyon (talk) 05:56, 27 February 2008 (UTC)
How about under Hybrid computers? The animal nervous system is truly an engineering marvel (like so much in nature). First, think of each neuron as a simple computer or as a complex gate (some of the central neurons can have as many as 10,000 connections, mostly inputs, or dendrites). Second, the neuron itself is something like an adder. That is, it sums its various neural inputs until it reaches its threshold firing level, upon which it discharges strongly and completely. While the neuron itself may also be viewed as a simple digital-to-analog converter (to get to its firing threshold), its output is again digital. Communication (across synapses) is by discrete little chemical packets, which is what the neuron sums; its output is again in the form of 'little chemical packets' whose number is related to the strength of its discharge. It gets more and more complicated. Some of those inputs (as in true gates) are inhibitory (they reduce, i.e., subtract from, the neuron's electrochemical potential). Others may simply potentiate - just make the neuron more sensitive to other input...
Why is this so marvelous technically? Because such a system of digital/analog 1) does not require a grounding system (which can be a real pain in any humanly engineered system); 2) does not allow circuit noise to build up - as a rule, the worst noise from point A to point B is no worse than the noisiest neuron in the path, which may be very few indeed, as some of those dendrites and axons are one or more feet in length; 3) has the most phenomenally efficient distributed power supply; 4) is massively parallel and redundant.
Enough for today...Normxxx (talk) 00:30, 9 March 2008 (UTC)
The rate of fire of a neuron falls along a continuous spectrum. Information is encoded in the firing rates of neurons and in the synaptic weights, which are also continuous. The fact that discrete neuronal spikes and discrete molecules of neurotransmitters are involved doesn't make the system digital. It's like calling the AKAT-1 a digital computer because electricity is made of discrete electrons. Wbrameld (talk) 04:13, 24 March 2008 (UTC)
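The mixed analog/digital character both sides describe matches the textbook leaky integrate-and-fire model: continuous summation of inputs, an all-or-nothing spike, and information carried in the firing rate. A toy Python version (the parameter values are arbitrary, chosen only for illustration):

```python
def integrate_and_fire(inputs, threshold=1.0, leak=0.95):
    """Leaky integrate-and-fire neuron: analog accumulation of inputs,
    digital (all-or-nothing) output spikes."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + x    # continuous summation with leak
        if potential >= threshold:
            spikes.append(1)                # complete discharge on firing
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# A stronger steady input yields a higher firing rate - the "analog"
# information survives in the rate, not in any single spike.
assert sum(integrate_and_fire([0.3] * 100)) > sum(integrate_and_fire([0.1] * 100))
```

Each individual spike is binary, yet the rate varies continuously with input strength, which is the sense in which the system is "really not either."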

Curta[edit]

The Curta is a digital computer. I am removing the reference to it from this article. Pmartel (talk) 16:36, 7 February 2009 (UTC)

"Chemical" analog computers[edit]

I believe there are or at least were analog computers using chemical (electrochemical?) reactions in liquids to represent computations, which are not mentioned in the article, as far as I can tell. —141.150.23.3 (talk) 17:50, 1 August 2009 (UTC)

Optical Processing[edit]

What about Convolution, Cross-Correlation, and Fourier Transform? When were these first used? I couldn't find the answer and hoped it would be here. Fourier Transform can be performed with a simple lens, bypassing the O(N lg N) operations in a sequential digital processor, or the O(N) operations in a digital systolic array processor.

Convolution and Cross-Correlation can be performed with a VanderLugt filter as described in Introduction to Fourier Optics, p. 182, McGraw-Hill, 1968.

_-T —Preceding unsigned comment added by 24.155.226.79 (talk) 05:25, 6 November 2009 (UTC)
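The frequency-domain shortcut the VanderLugt filter exploits - transform, multiply pointwise, transform back - can be checked numerically. A small NumPy sketch comparing it with direct circular convolution (an 8x8 toy example; the arrays are arbitrary test data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
img = rng.random((n, n))
kernel = rng.random((n, n))

# What the optical filter does in one pass: FT, pointwise multiply, inverse FT.
freq_result = np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)).real

# Direct circular convolution for comparison - O(N^2) work per output pixel.
direct = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            for l in range(n):
                direct[i, j] += img[k, l] * kernel[(i - k) % n, (j - l) % n]

assert np.allclose(freq_result, direct)
```

A lens performs the Fourier transforms "for free" at the speed of light, which is why optical processing remained competitive with digital computers for as long as it did.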

As an undergraduate at USC, I worked with a professor, Alexander Sawchuk, who was doing optical Fourier transforms. This was around 1974. You could search the literature for his papers. --Jc3s5h (talk) 15:55, 6 November 2009 (UTC)
Joe Goodman wrote the book mentioned above, in 1968; in 1974, he was my advisor at Stanford. In the 1960s and 1970s, there was a lot of optical image processing; digital computers became capable enough during that time to gradually displace the optical techniques. Here is a paper on the history. And here is the history page in Joe's book. Dicklyon (talk) 16:23, 6 November 2009 (UTC)

Mee no understand big words[edit]

Could someone translate it so that a 13 year old can understand it? -Absorr (talk) 14:06, 18 January 2010 (UTC) —Preceding unsigned comment added by 72.49.197.231 (talk)


hi none of beeseswax got it cump!!!!! —Preceding unsigned comment added by 195.229.237.37 (talk) 13:19, 6 October 2010 (UTC)

Misuse of sources[edit]

Jagged 85 (talk · contribs) is one of the main contributors to Wikipedia (over 67,000 edits; he's ranked 198th by number of edits), and practically all of his edits have to do with Islamic science, technology and philosophy. This editor has persistently misused sources here over several years. This editor's contributions are always well provided with citations, but examination of these sources often reveals either a blatant misrepresentation of those sources or a selective interpretation, going beyond any reasonable interpretation of the authors' intent. Please see: Wikipedia:Requests for comment/Jagged 85. I searched the page history, and found 20 edits by Jagged 85. Tobby72 (talk) 10:50, 12 June 2010 (UTC)

I took out some of the unsourced items that seemed to be primarily about promoting certain inventors or their cultures. If the remaining ones with sources are misusing the sources, that should be pointed out by someone. We shouldn't be disputing a section just because an editor who likes to promote Islamic contributions added stuff to it. Dicklyon (talk) 02:59, 18 June 2010 (UTC)

See WP:Jagged 85 cleanup for an overview of the problem.
Following is a summary of the edits to this article by Jagged 85 (each diff shows the result of several consecutive edits):

Each edit diff is at Cleanup5 (big page). Johnuniq (talk) 00:57, 9 April 2011 (UTC)

Computer?[edit]

The article on computers makes a big deal about programmability, and yet here devices like the astrolabe are mentioned which have inherently one function only and cannot be anywhere near Turing-complete. This is quite a huge discrepancy between the articles, and if someone's feeling brave, this should be made clear in the article somewhere for consistency. 188.220.171.33 (talk) 21:43, 21 October 2010 (UTC)

"Computer" also used to be the term used to describe the people whose job was to do computations. I don't know if these people were programmable, or Turing complete. Dicklyon (talk) 04:29, 22 October 2010 (UTC)
If the Computer article emphasizes programmability to that degree, then it needs to be fixed to clarify that this only applies to what we think of as "computers" nowadays, namely digital computers, not to everything that has been termed a "computer". Certainly it does not apply to the human occupation which the term once referred to, nor does it apply to most analog computers. --Colin Douglas Howell (talk) 07:02, 22 October 2010 (UTC)
True on the first point, but not true in terms of analogue computers - there needs to be a distinction between a computer and a calculator, and the ability to change what is input into a computer is that defining feature. This is what makes a difference engine a calculator, and the Analytical Engine a computer. Neither of these is a "computer" in the modern sense of the word, but they fit into the two categories perfectly. 188.220.171.33 (talk) 14:42, 22 October 2010 (UTC)
In fact, the people who were computers were necessarily 'programmable', in that they could perform different tasks on data presented to them rather than only a specific manipulation, so the definition even extends to them. The Oxford Dictionary agrees with this definition too, as far as I can tell. 188.220.171.33 (talk) 14:44, 22 October 2010 (UTC)
Actually, looks like the Computer article already accepts all these things as computers, where it says "A computer does not need to be electric, nor even have a processor, nor RAM, nor even hard disk. The minimal definition of a computer is anything that transforms information in a purposeful way." Dicklyon (talk) 02:25, 25 October 2010 (UTC)
Well, analog computers were programmable too, but usually by way of pluggable connections or such. Have you looked up "analog computer" in that dictionary? Dicklyon (talk) 16:47, 22 October 2010 (UTC)
In fact, the astrolabe, gun-aiming devices, the Antikythera mechanism - none of these could perform any arbitrary manipulation of data, and thus by Wiki's own definition they are not true computers. In the interests of impartiality, I'm trying as best I can here to base the distinction between a computer and a calculator on available sources, and this article is in the minority for suggesting the opposite of all these other definitions. The lack of citations should be cause for concern as well. 86.153.101.255 (talk) 11:04, 24 October 2010 (UTC)
I don't think wikipedia gets to define what an analog computer is; if there's a definition of "computer" that conflicts with common usage of "analog computer", that would be the place to direct your desire to improve things. Also, as you say, adding some more good sources would be good. Here are some: [2]. Here's a great paper: [3]. Dicklyon (talk) 01:31, 25 October 2010 (UTC)
The definition is uncited, and common sense tells us that it is incomplete and flawed. According to the definition, even a microphone (transforms soundwaves into electricity) or a lens (transforms lightwaves into a viewable image) are computers. I am sure there are computer experts who would never include the astrolabe et al. in the list as computers. 91.132.141.80 (talk) 15:06, 27 March 2011 (UTC)

fluidic computers[edit]

There's barely a mention of fluidic computers (as "hydraulic computers"), and no mention of their mass-produced use -- old car (and airplane) transmissions... --ssd (talk) 22:43, 5 April 2011 (UTC)

You know what to do! Cite it and write it. Though back in the day, when I considered buying an automatic transmission from a junkyard just to tear it apart, I seem to recall that the controls had all sorts of little check valves, balls, springs, and cleverly carved passages - which would make it not a pure fluidic "computer". Not every closed-loop controller is usefully described as a computer, for that matter. --Wtshymanski (talk) 14:25, 6 April 2011 (UTC)
Fluidic logic devices are still used in industrial hydraulic control systems, but not at the level of complexity of an actual computer. Very few (probably only one) fluidic logic CPUs were actually built. Ref: "A Fluid Logic Digital Computer", June 1965, Sperry Univac[4]. True fluidic devices are mostly digital, not analog - there's a fluidic flip-flop device.[5] --John Nagle (talk) 15:15, 5 June 2011 (UTC)

Jagged stuff[edit]

We appear to have a consistency problem: The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra... The Planisphere was a star chart astrolabe invented by Abū Rayhān al-Bīrūnī in the early 11th century William M. Connolley (talk) 08:44, 22 April 2011 (UTC)

analog computer[edit]

Hi to everybody,

I thought there was an error on this page: it says "computering devices" instead of "analog computer" as given in the title. Am I wrong? — Preceding unsigned comment added by 78.233.218.32 (talk) 16:28, 29 December 2011 (UTC)

Antikythera mechanism, clockwork, and computers[edit]

From Clockwork A clockwork is the inner workings of either a mechanical clock or a device that operates in a similar fashion. Specifically, the term refers to a mechanical device utilizing a complex series of gears.[1][2] One of the earliest known examples of a clockwork mechanism is the Antikythera mechanism

My grandfather clock is clockwork, is it a computer? If not, and I think it's not, what distinguishes the Antikythera mechanism from my clock? 69.106.239.236 (talk) 05:20, 12 December 2012 (UTC)

Ah, I'll answer my own question: clockwork fails the "continuously changeable aspects of physical phenomena" criterion (in the 1st line of this article), since gears with cut teeth, or cogs, are discrete, not continuously changeable. Thus the Antikythera mechanism is a model, a very elaborate model -- but not an analog computer. 69.106.239.236 (talk) 06:00, 12 December 2012 (UTC)

Trim[edit]

I've just taken out a lot of "essay-like" content, and trimmed the structure up. I think the result is better overall, but the article still needs work. One suggestion to think about is whether it would be best to separate out "general purpose electronic analog computers" from the rest as a new article. Snori (talk) 09:36, 21 December 2012 (UTC)

You don't think someone coming to this article might want more than just a bullet point list of factoids but perhaps some explanation of "how" analog computers work? --Wtshymanski (talk) 14:09, 21 December 2012 (UTC)
Perhaps, but the mishmash that's there currently doesn't do much to provide it. For the moment I'll leave this reverted, but I'll be working on a revision with some of the "how" retained - and with proper references. Snori (talk) 09:10, 23 December 2012 (UTC)