I like the terminology of digital being a transmission of symbols. - Omegatron 21:52, May 24, 2005 (UTC)
the human eye 
The article currently claims
- the human eye may be able to detect tens of thousands of different intensities of pure green
Really? I've been told that humans cannot distinguish even 256 intensities of pure green. See "How many bits do I need to smoothly shade from black to white?" by Charles Poynton. (Should this be posted on the eye article?) --DavidCary 09:37, 26 Jun 2005 (UTC)
- I agree, and changed the example. But the article misses an important point, and I'm not sure how to best add it in: Compared with digital, analog has higher resolution but lower accuracy (due to noise, amplifier distortion, etc.). Compared with analog, digital has lower resolution but higher accuracy. In a sense, using digital sacrifices resolution for accuracy. --Rick Sidwell 17:14, 22 July 2005 (UTC)
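The resolution-versus-accuracy trade-off described above can be sketched numerically. This is only an illustration: the 16-level quantizer, the noise amplitude, and the test value are all made-up parameters, not anything from the article.

```python
import random

random.seed(42)

LEVELS = 16  # hypothetical digital system with 16 quantization levels

def quantize(x, levels=LEVELS):
    """Round a value in [0, 1) to the nearest of `levels` discrete steps."""
    step = 1.0 / levels
    return round(x / step) * step

def transmit(x, noise=0.01):
    """Model a channel that adds a small amount of random noise."""
    return x + random.uniform(-noise, noise)

original = 0.4375  # 7/16, exactly representable with 16 levels

# Analog path: full resolution, but the channel noise stays in the result.
analog_received = transmit(original)

# Digital path: coarse resolution, but re-quantizing removes the noise.
digital_received = quantize(transmit(quantize(original)))

print(digital_received == original)          # digital recovers the value exactly
print(abs(analog_received - original) > 0)   # analog result carries the noise
```

The digital channel gives up resolution (only 16 representable values) but the small channel noise cannot move a level past the halfway point, so the original value is recovered exactly; the analog channel keeps full resolution but the noise is inseparable from the signal.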
Removed text 
I removed the following text because, while probably true, it doesn't really apply to the subject at hand. --Rick Sidwell 01:04, 6 September 2005 (UTC)
- It should be noted that photographic film is not perfect, being subject to aberrations. Losses in analog systems are often modelled as a noise spectrum and modulation transfer function (MTF). The MTF of many analog systems, including film, typically "rolls off" with increasing frequency.
Nits, and the picking thereof 
I think the article accurately reflects the way the term is most commonly used, but the comparison of digital to discrete and the comparison of digital to analog are flawed if you're being very precise.
The problem is that there is a tacit assumption when calling digital systems "numbers", namely the assumption of fixed-length representation inherent in computer system design.
In the limit, digital and analog systems have the same resolution and the same accuracy. In practice, most of the time, people are more familiar with situations, like audio recording, where analog systems behave as described above. But there are other, less common situations, like solutions of certain partial differential equations, where the analog systems are both higher resolution and more accurate. The most extreme example of this is the use of a windtunnel versus a computer to simulate fluid flow. Both are models. The first is analog, the second digital. The analog system is more accurate and of a higher resolution than the digital system.
The entry begins with:
- A digital system is one that uses numbers, especially binary numbers, for input, processing, transmission, storage, or display, rather than a continuous spectrum of values (an analog system) or non-numeric symbols such as letters or icons.
The entry then concludes with a catalogue of non-numeric symbols, which are identified as "Historical Digital Systems." Isn't this a bit incoherent? I think the entry would over-all benefit by concentrating on the discrete nature of digital encoding (as opposed to the non-discrete nature of analogue).
I'd also like to query the supposition that digital encoding necessarily implies some form of numerical representation. I may be out on a limb here (I'm not a mathematician) but - while strictly speaking it's true that any enumerable quantity is a number - isn't it a bit misleading to say that "digital" fundamentally or necessarily implies the representation of number? Doesn't that privilege one possible interpretation of a digital encoding? Wouldn't it be more correct to say that number can be represented digitally; i.e. the "digital representation of number," but that a digital encoding can easily represent a letter in the alphabet (ASCII), part of a sound recording, etc?
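On the point that a digital encoding can easily represent a letter: ASCII is a quick concrete illustration. The digital system only ever handles the code number, not the letter itself (the snippet is just an illustration, not from the article).

```python
# ASCII maps each character to an integer code; the digital system stores
# and transmits the integer (here shown in 7-bit binary), not the letter.
letter = 'A'
code = ord(letter)            # ASCII code point
bits = format(code, '07b')    # 7-bit ASCII representation

print(code)   # 65
print(bits)   # 1000001
assert chr(int(bits, 2)) == letter  # decoding recovers the symbol exactly
```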
I agree that the quoted text is confusing and likely not useful. I believe the point they were making is that the values encoded in a digital format will be discrete values. These values can be called numbers or anything else, but the point is that in, say, a digitally encoded signal, the signal will represent a discrete value, e.g. either 1 or 2, and the receiving device will interpret it as such. This is in contrast to analogue signals, which will be interpreted as a particular value (limited by the receiving device's precision in measuring that value) along a continuous set of values. So the analogue signal could be interpreted as anything from 0 to 1.
The point is that analogue may be interpreted as a number as well, but it can be interpreted as any number within the range, and it will not be perfectly interpreted, as perfect precision is impossible. So it is only a particular number by virtue of the limits on the precision of the interpreting device; in reality it is a value not expressed by any number of finite length, just like the weight of a pencil. So the analogue signal interpreted as, say, 1 isn't really 1, but rather rounded to 1 at some degree of precision, whereas the digital signal interpreted as 1 is just that number, with no rounding or precision to worry about. In that way, digital is a discrete value, and analog is an approximation of some point on a continuous line. --220.127.116.11 (talk) 03:45, 7 October 2009 (UTC)
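A minimal sketch of the interpretation difference described above, with made-up voltages and precision: the digital receiver snaps to the nearest allowed symbol, while the analog receiver can only report a rounded measurement.

```python
def digital_receive(voltage, symbols=(0.0, 1.0)):
    """A digital receiver interprets the signal as the nearest allowed symbol."""
    return min(symbols, key=lambda s: abs(s - voltage))

def analog_receive(voltage, precision=2):
    """An analog receiver can only report a measurement rounded to its precision."""
    return round(voltage, precision)

noisy = 0.93  # a transmitted "1" after picking up some noise

print(digital_receive(noisy))  # 1.0 -- exactly the intended symbol
print(analog_receive(noisy))   # 0.93 -- an approximation of the true value
```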
Confused? Me too! 
I think the "confused" entry is making a great point. We really need to talk about the fundamental differences between digital and analog in this article; and we have not achieved this yet.
The concept of digital is very well defined and very straightforward. It has nothing to do with numbers, computers, binary numerals, transmission, storage, or display of data or signals. That is a layman's definition. And that might very well be OK for this article, since it's a high-level definition of the term. But something more is needed.
The opening three paragraphs are, in fact, inaccurate. Most digital systems do not use binary numbers to represent data. And the first sentence of paragraph two is just wrong. 'The distinction of "digital" versus "analog"' has nothing to do with the "method of input, data storage and transfer, or the internal working of a device." Though "internal workings of a device" is a vague term, for sure. These terms are all so vague.
Can't we talk about the concept of having a pre-arranged alphabet of symbols, which reduces randomness in the system? And that REGENERATION of the original data, rather than amplification, is really what makes a system digital. Analog systems can only amplify and reproduce. Digital systems can RECREATE the original data. These are fundamental concepts that are at the heart of the definition of digital.
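The regeneration-versus-amplification point can be sketched like this (the noise level, hop count, and 0.5 decision threshold are illustrative assumptions, not from the article):

```python
import random

random.seed(0)

def noisy_link(signal, noise=0.05):
    """Every hop through a physical link adds a little random noise."""
    return [s + random.uniform(-noise, noise) for s in signal]

def amplify(signal):
    """Analog repeater: passes everything on, noise included (unity gain
    here for simplicity -- the point is that nothing is cleaned up)."""
    return signal

def regenerate(signal):
    """Digital repeater: re-decides each symbol against a threshold,
    discarding whatever noise accumulated on the previous hop."""
    return [1.0 if s > 0.5 else 0.0 for s in signal]

data = [1.0, 0.0, 1.0, 1.0, 0.0]

analog, digital = data, data
for _ in range(10):  # ten hops through noisy links
    analog = amplify(noisy_link(analog))
    digital = regenerate(noisy_link(digital))

print(digital == data)   # regenerated at every hop, so recreated exactly
print(analog == data)    # noise has accumulated over ten hops
```

Because the digital repeater makes a fresh decision at each hop, the per-hop noise never accumulates; the analog chain can only carry the noise forward.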
- The comment you're reacting to is over a year old. If you have a good improvement over the present definition, let's see it. I agree that the especially binary numbers is quite lame; the rest is not bad as general-audience definitions go, I think, but I agree it can be improved. Note that your complaint about the distinction having nothing to do with input, output, etc., is a bit off base, since it does not imply that those uses are part of the distinction. Dicklyon 06:11, 6 April 2007 (UTC)
'Unreferenced' tag 
An article such as this should be capable of being written by a reasonably competent engineer without reference to other material. Unfortunately, this article is very poor in its current state. It tries to be too complicated.
It has a number of major problems. For example, there is a section on noise being introduced to the digital signal, but it totally fails to mention the single largest source of noise - the digitisation process itself.
It also gives an example of a digital system - the smoke signals. Fair enough, the smoke (or absence thereof) is a fair example of digital communication. But it is spoilt because it refers to the smoke itself as "an analogue carrier". It is not an analogue carrier precisely because it does not represent anything else (i.e. it is not an analogue).
For Morse code it refers to five digital states. Not so. There are two digital states, the presence or absence of an electric current or carrier. What the various intervals represent is purely a matter of interpretation.
For the modem it refers once again to an 'analogue carrier signal' that is most definitely not an analogue. This discussion is by no means exhaustive.
Should I get the time, I will try to provide a better article, but it won't be anytime real soon. I B Wright 10:37, 8 August 2007 (UTC)
I think both the analog and digital column need pictures. Show a smoothly varying soundwave, and then show a discrete approximation of it. —Preceding unsigned comment added by 18.104.22.168 (talk) 00:06, 24 October 2007 (UTC)
A "digital" system is more abstract than what is presented in this article. John Haugeland defines a digital system as a "set of positive and reliable techniques (methods, devices) for producing and reidentifying tokens, or configurations of tokens..." I think someone should start from something like this, and then move into examples of more concrete digital systems, and digital vs. analog... —Preceding unsigned comment added by 22.214.171.124 (talk) 04:31, 5 November 2007 (UTC)
Morse code - symbols or states? 
The Morse Code entry claims that there are five states, which is incorrect. There are only two states, on or off, but they are used to represent a small set of symbols - dot, dash, intra-character gap (between each dot or dash), short gap (between each letter), medium gap (between words), and long gap (between sentences).
These symbols are then grouped into larger symbols - letters, numbers, etc.
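A quick sketch of the two-states/several-symbols distinction: the channel carries only on/off intervals, and the symbols emerge from interpreting the durations (the 1/3/7 unit ratios are the conventional Morse timing; the decoder itself is just an illustration).

```python
# Each element is (state, duration): the channel itself has only two states,
# on (key down) and off (key up). Symbols emerge from interpreting durations.
def classify(state, duration):
    if state == 'on':
        return 'dot' if duration == 1 else 'dash'   # 1 unit vs 3 units
    # off durations: 1 = intra-character gap, 3 = letter gap, 7 = word gap
    return {1: 'intra-char gap', 3: 'letter gap', 7: 'word gap'}[duration]

# "A" in Morse is dot-dash: on for 1 unit, off for 1 unit, on for 3 units
signal = [('on', 1), ('off', 1), ('on', 3)]
symbols = [classify(state, d) for state, d in signal]
print(symbols)  # ['dot', 'intra-char gap', 'dash']
```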
On another topic - I also think it would be worthwhile describing the difference between quantizing time (discrete-time systems) and quantizing signal (A/D conversion etc.)
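To illustrate the suggested distinction between quantizing time and quantizing signal (the sample rate and step size are arbitrary illustration values): sampling discretizes time while leaving amplitudes continuous; amplitude quantization then makes the values discrete too, which is what a full A/D conversion does.

```python
import math

def sample(f, rate, n):
    """Discrete-time: evaluate a continuous signal at regular instants.
    The amplitudes remain continuous real numbers."""
    return [f(i / rate) for i in range(n)]

def quantize(samples, step=0.25):
    """Quantized amplitude: snap each sample to a discrete level."""
    return [round(s / step) * step for s in samples]

signal = lambda t: math.sin(2 * math.pi * t)  # a 1 Hz sine wave

sampled = sample(signal, rate=8, n=8)   # discrete time, continuous amplitude
digital = quantize(sampled)             # discrete time AND discrete amplitude

print(sampled[1])  # 0.7071... -- still a continuous value
print(digital[1])  # 0.75      -- snapped to a 0.25-step level
```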
Lack of sources 
The reason why there aren't many sources (if any) for the way digital is being referred to on Wikipedia is because "digital" is being used to describe raw data and not data processing. First we have to understand that "digital" is meaningless if it doesn't pertain to information. All data in its purest form is analog, and is only made digital by the way it's processed. The difference between analog data and digital data is the same as the difference between data and information. Therefore, data is only digital as it pertains to information, not raw data. It makes perfect sense when you see digital data as the subset of analog data that it is. But when you try to reverse it and treat analog data as if it somehow intersects or is a subset of digital data, it's no wonder that it's vague and confusing. It's contrary to logic. Oicumayberight (talk) 19:18, 14 April 2009 (UTC)
digital network 
I typically use the phrase "digital network" to mean something like this:
- digital network: a digital telecommunications network, where the information travels from one enclosure to another in digital form. Such networks sometimes use non-electronic components -- for example, Erbium-doped fiber amplifier -- as well as digital electronics.
While the terms "digital" and "network" may apply to the way digital components are connected in digital electronics -- for example, the way logic gates and hardware registers are connected in an electrical network inside a single IC, perhaps represented as an and-inverter graph, to make a microprocessor -- I don't know anyone who would call a single integrated circuit a "digital network".
Start by distinguishing digital network from analog network. That will make it more obvious which article the term should be redirected to. If there is none, consider whether or not the term "digital network" serves to clarify anything or is merely another buzzword. My guess is that computer network would be the closest thing. Oicumayberight (talk) 18:42, 2 June 2009 (UTC)
I strongly disagree that DNA is a proper example of a "Historical digital system." With the exception of DNA, all the other examples are of digital systems designed and created by humans. DNA, on the contrary, is a product of a mechanistic, natural, unplanned, undesigned, and purposeless process, namely biological evolution. The other examples given of historical digital systems were planned and created for a human purpose by an intelligent process, that is, they are teleological. DNA, however, results from a teleonomic process, one that operates naturally without planning, forethought, purpose, or intelligence. Teleological and teleonomic systems are different in kind or quality, not degree or quantity, so DNA is a data or information system different in kind from a historical data or information digital system created by humans. Thus, it should not be used as an example in a digital context.
The significant difference is that DNA is not digital, contrary to the intent of the DNA example, which would not be incorrect if read outside of a digital context. Digital means using data in the form of numerical digits, or data expressed in numerical form, or representing or operating on data or information in numerical form (three typical definitions of "digital" in the context of data or information storage, transmission, and manipulation). DNA is not data or information in numerical form, but discrete or discontinuous data in chemical form. Although DNA can be described in numerical form, e.g. as triplet codons, this is not a real numerical attribute but rather an artificial one applied to it by a human description long after its first appearance and the subsequent human discovery of its structure. DNA is a code that stores and transmits data or information. The DNA code consists of discontinuous and discrete nucleotide bases arranged in unique sequences but without a real numerical (and therefore digital) quality. Having the properties of being discrete and discontinuous does not automatically give a system a digital quality or structure; this can only be provided by possessing a numerical property. DNA has no relationship with digital information using ones and zeros or the base 10 numerical system. DNA stores and transmits non-numerical, non-digital information using molecular chemical compounds. The nucleotide bases can be symbolized numerically by purposeful application of a human-created alphanumeric system, but the DNA coding system itself has no original and inherent numeric property. So, a molecular coding system is significantly different from a digital coding system. In both substance and origin, DNA is not digital.
Since this is the case, I suggest that the DNA example is incorrectly included in the list of "historical digital systems." It should be removed. I plan to remove it but will wait a week to see what arguments are presented here in the discussion section. If someone can convince me that I'm wrong, I will not remove it. If others write here with more arguments to agree with me, thank you. If an administrator or sysop who monitors this section agrees with me now and wants to remove the DNA example, please be my guest. In the meantime, I ask a WP editor or admin to place the appropriate "controversial" or "disputed" icon in front of the DNA example (simply because I don't know how to do this!). Steven (talk) 08:07, 9 August 2009 (UTC)
I agree with Steven's principal claim that DNA is not digital. Even if an argument can be made in favor of its inclusion, it can reasonably be argued to the contrary, as Steven has done. Therefore, based on the useful, ample, and unambiguous list of examples already provided, the article in no way suffers from having DNA removed. Its inclusion, on the other hand, begs the question at best and is speculative at worst, and therefore dilutes the quality of an otherwise informative and concise article. Wolfworks (talk) 04:07, 7 November 2010 (UTC)
- I think there is a sense in which DNA is digital. In fact, there has to be in order that desirable traits are maintained for thousands of generations (whereas analog copying would cripple the descendants). However, there are lots of ways in which DNA is not digital per the comments above, and the topic is not helpful to this article (and DNA does not claim "digital"). I have removed the item. Johnuniq (talk) 04:32, 7 November 2010 (UTC)
The information of DNA is digital but it is encoded using chemicals rather than electrical signals or binary or base 10 digits. However, I do not know if that is enough to constitute a system in the sense of this article. Richard Dawkins and other geneticists and biologists have also described DNA as digital information. http://books.google.com/books?id=ZudTchiioUoC&pg=PA97&lpg=PA97&dq=DNA+digital+richard+dawkins&source=bl&ots=vU7nw8KLs3&sig=eIKe5YVu6GU84OPh-ZWK0WKLdy8&hl=en&ei=jXKSTanaHdG3tgeun5Rv&sa=X&oi=book_result&ct=result&resnum=6&ved=0CDkQ6AEwBQ#v=onepage&q&f=false - Anon 29 March 2011 —Preceding unsigned comment added by 126.96.36.199 (talk) 00:29, 30 March 2011 (UTC)
There is an obvious problem on Modem: most people think a modem converts digital signals to analog signals. The modem article is already too long to include a tutorial on the difference between analog and digital signals, and there isn't any good place to reference -- so any suggestions would be welcome. 188.8.131.52 (talk) 08:00, 8 September 2010 (UTC)
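For what it's worth, the digital-data-on-analog-waveform point can be sketched with a toy frequency-shift-keying modulator. The two tone frequencies below are the Bell 103 mark/space values, but everything else (sample rate, samples per bit, the discontinuous phase at bit boundaries) is a simplification for illustration, not how any real modem works.

```python
import math

# Toy FSK modulator sketch: each bit selects one of two carrier frequencies.
# The output waveform varies continuously in amplitude, even though the
# information it carries is digital.
F0, F1 = 1070, 1270         # Hz, space/mark tones (Bell-103-style values)
SAMPLE_RATE = 8000          # samples per second (illustrative)
SAMPLES_PER_BIT = 40        # illustrative, not a real baud rate

def modulate(bits):
    wave = []
    for bit in bits:
        freq = F1 if bit else F0
        for _ in range(SAMPLES_PER_BIT):
            t = len(wave) / SAMPLE_RATE
            wave.append(math.sin(2 * math.pi * freq * t))
        # note: phase is not kept continuous across bit boundaries here
    return wave

wave = modulate([1, 0, 1])
print(len(wave))                             # 120 samples for 3 bits
print(all(-1.0 <= s <= 1.0 for s in wave))   # amplitudes vary continuously
```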