Talk:Information theory

log probability

Why does log probability redirect here? —Preceding unsigned comment added by 199.46.198.230 (talk) 14:12, 18 March 2009 (UTC)

Because -ln(p) = I, where I is the amount of information (the self-information) conveyed by an event of probability p. Kevin Baas (talk) 16:10, 13 January 2011 (UTC)
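A minimal sketch of that identity in Python (the helper name self_information is purely illustrative): the sign flip turns a probability's logarithm into a non-negative amount of information, in whatever unit the log base sets.

    import math

    def self_information(p, base=2):
        # Surprisal of an event with probability p; base 2 gives bits, base e gives nats.
        return -math.log(p, base)

    print(self_information(0.5))          # 1.0 bit for a 50/50 event
    print(self_information(0.5, math.e))  # ~0.693 nats for the same event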

thermodynamic entropy

I'd like to add a section on the relationship of Shannon's information to thermodynamic entropy in the section on applications to other fields. It would draw heavily from Ben-Naim's book A Farewell to Entropy: Statistical Thermodynamics Based on Information, but would also mention the impact of information theory on the resolution of Maxwell's Demon. It would probably just be 1-2 paragraphs, but I might develop it into a separate article. Thoughts? Maniacmagee (talk) 22:01, 21 August 2009 (UTC)

technical note about log

I am working through the math, and there is a seeming discrepancy that should be corrected (if it is an error) or explained. The entropy of the random variable X involves the log function. In the general definition, just "log" is written, which in most fields means the natural log. Later, log2 is written. I presume log2 is meant in both cases and that two notational conventions are represented here. I think the article should be fixed to use "log2" consistently. I don't want to change it before getting a response, because everything I know about this entropy function I know from this article. Thoughts? Tbonepower07 (talk) 04:21, 16 February 2010 (UTC)

It is said in "Quantities of information" that "The choice of logarithmic base in the following formulae determines the unit of information entropy that is used." In computer science, log2 is the most natural choice of logarithm. Similarly, in the case of the binary entropy function, you are most commonly interested in log2, since the function gives the entropy of a variable that can take only two values. However, the article could and probably should use the natural log throughout. Nageh (talk) 14:40, 23 March 2010 (UTC)
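To make the unit point concrete, a rough sketch (the function name entropy is illustrative only): the same distribution measured with base-2 and natural logs differs exactly by a factor of ln 2.

    import math

    def entropy(probs, base=2):
        # Shannon entropy -sum p*log(p); the log base only sets the unit (2 -> bits, e -> nats).
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    dist = [0.5, 0.25, 0.25]
    print(entropy(dist, 2))                     # 1.5 bits
    print(entropy(dist, math.e))                # ~1.0397 nats
    print(entropy(dist, math.e) / math.log(2))  # 1.5 again: nats divided by ln 2 gives bits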

Overview

Having practically no experience as a contributor, I hesitate to rewrite a section. On the other hand, having had a course in "Information Theory" at MIT in the 1950s, I believe that the Overview section has some serious errors. Should I undertake to rewrite large portions of that section?

"The main concepts of information theory can be grasped by considering the most widespread means of human communication: language." True enough, but the rest of the paragraph describes the main concepts of coding theory, not of information theory. The main concept of information theory is that a platitude such as "Thank you; come again" conveys less information the urgent plea, "Call an ambulance!" not because it is less important, but because it is less unexpected. In context, however, either of these messages might convey very little information, the former because it is not unexpected at the end of a transaction, the latter because if someone is obviously injured, "it goes without saying" that one should call an ambulance. The next main concept of information theory is that the speech channel has a capacity limit. If you talk faster, you convey information at a higher rate, unless you talk so fast that your hearer can't understand you. Symbols are being transmitted at a faster rate, but they are being received with errors; the channel is noisy.

Again "The central paradigm of classical information theory" is not "the engineering problem of the transmission of information over a noisy channel." That is the central paradigm of coding theory. The central paradigm of classical information theory is the quantification of information and of the capacity of an information carrying channel (which may be noisy or noiseless).

It might also be appropriate in the Overview section to introduce some simple quantitative concepts, for example, to define bits in terms of binary digits, to point out that capacity is defined for noiseless as well as noisy channels, or even to mention, by way of illustration, that when a fair and balanced coin is flipped, a message revealing which side came up conveys exactly one bit of information. Marty39 (talk) 21:06, 30 September 2011 (UTC)
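That coin-flip figure in numbers (the helper name coin_entropy is purely illustrative): a fair flip carries exactly one bit, and any bias reduces it.

    import math

    def coin_entropy(p_heads):
        # Entropy in bits of one flip with P(heads) = p_heads.
        return -sum(q * math.log2(q) for q in (p_heads, 1 - p_heads) if q > 0)

    print(coin_entropy(0.5))  # 1.0: a fair coin's outcome is exactly one bit
    print(coin_entropy(0.9))  # ~0.469: a biased coin's outcome is less surprising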

Welcome! It is quite some time since I looked at this article, but I believe my thoughts were much the same as what you have noted. Please edit as you see fit—don't worry about mistakes and formatting issues as they can be fixed. I see that you know citations are needed; if you like, just put the info after the text in brackets and someone will tweak it. Johnuniq (talk) 00:10, 1 October 2011 (UTC)
As already mentioned, you're encouraged to contribute. Please also feel free to make contributions to the coding theory article, which presently needs to be rewritten completely. Isheden (talk) 11:17, 1 October 2011 (UTC)

The latest Nobel prize in economics went to economists who used information theory to build a model of how people respond to government policies. Perhaps this means economics should be listed in the opening section as a field in which information theory has been applied. — Preceding unsigned comment added by 68.16.142.186 (talk) 23:00, 2 January 2012 (UTC)

Black holes vs conservation of information

In an astrophysics series on the Discovery Channel, they said something about black holes "...breaking the fundamental law of conservation of information" and some quarrelling between theoretical physicists over black holes. I had never before heard of such a fundamental law, and I thought it sounded very strange. So I went looking for it, and what I found on Wikipedia had little to do with physics as far as I could see. Can someone point me to an article better describing what the TV series means by conservation of information, or is the series just misinformed? Thanks! Eddi (Talk) 00:27, 9 May 2012 (UTC)