Talk:Entropy rate


"An alternative, related quantity"[edit]

It would seem to me that if the limit H' exists, then H must exist as well, and also they must be equal. If this is the case, it would make things clearer to say so in the article. 198.145.196.71 00:17, 14 September 2007 (UTC)

You'd have to prove that, as I don't think it holds in general, but I dunno. linas (talk) 03:07, 18 August 2008 (UTC)


REPLY: Yes, there is a theorem and proof in Cover and Thomas, page 75 (see ref list), that for stationary stochastic processes both limits exist and are equal. No mention of "strongly stationary" or anything like that. I suggest changing the relevant line to: "For stationary stochastic processes, both of these limits exist and they are equal" and citing Cover and Thomas. But I'm not sure how to put that into the article. Could the next person who reads this and knows how to make inline citations please make the change? —Preceding unsigned comment added by 140.247.23.90 (talk) 17:25, 3 March 2010 (UTC)
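For anyone following along, the two quantities in question are

:<math>H(X) = \lim_{n\to\infty} \tfrac{1}{n} H(X_1, \ldots, X_n)</math> and <math>H'(X) = \lim_{n\to\infty} H(X_n \mid X_{n-1}, \ldots, X_1)</math>,

and the Cover and Thomas result (Theorem 4.2.1, if I remember the numbering correctly) is that both limits exist and coincide for any stationary process. Here is a small numerical sketch of the Markov special case; the two-state transition matrix is invented purely for illustration:

<syntaxhighlight lang="python">
import numpy as np

# Two-state transition matrix, made up for illustration.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
mu = mu / mu.sum()

def h(p):
    """Shannon entropy in bits of a probability vector (zeros skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H'(X): for a stationary Markov chain, H(X_n | X_{n-1}, ..., X_1)
# reduces to H(X_2 | X_1) = sum_i mu_i * H(row i of P) for every n >= 2.
H_prime = sum(mu[i] * h(P[i]) for i in range(len(mu)))

# H(X): per-symbol block entropy. In stationarity the chain rule gives
# H(X_1, ..., X_n) = H(X_1) + (n - 1) * H(X_2 | X_1).
for n in (1, 10, 100, 1000):
    per_symbol = (h(mu) + (n - 1) * H_prime) / n
    print(f"n={n:5d}  (1/n) H(X_1..X_n) = {per_symbol:.6f}   H' = {H_prime:.6f}")
</syntaxhighlight>

For a Markov chain the conditional entropy is constant in n, so the per-symbol block entropy is just a running average converging to it; the general stationary case needs the Cesàro-mean argument given in the book.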

Strongly stationary

I propose to change "Strongly Stationary" to either "Strictly Stationary" or just "Stationary". The last two are used most often in the Information Theory community. —Preceding unsigned comment added by Idaniel314 (talkcontribs) 11:19, 30 December 2007 (UTC)

Hmm, article should be clarified to define "strongly stationary", as opposed to just-plain-old stationary. linas (talk) 03:07, 18 August 2008 (UTC)
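For what it's worth, the standard definition: a process is strictly (equivalently, strongly) stationary when every finite-dimensional distribution is invariant under time shifts,

:<math>\Pr\{X_1 = x_1, \ldots, X_n = x_n\} = \Pr\{X_{1+k} = x_1, \ldots, X_{n+k} = x_n\} \quad \text{for all } n, k \text{ and all } x_1, \ldots, x_n,</math>

as opposed to weak (wide-sense) stationarity, which only constrains means and covariances. "Strongly stationary", "strictly stationary", and the unqualified "stationary" of Cover and Thomas all name this same condition.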

Merge?

I recently created the articles information source (mathematics) and Markov information source, which significantly overlap the material here (I wasn't aware of this article). A possible merge or re-arrangement might be in order. linas (talk) 03:07, 18 August 2008 (UTC)

Entropy rate of the human genome

Aside from the lack of a reliable source, I don't see the point of this example. The only reason to include an example here would be to help illuminate the definition for someone who doesn't quite get the whole picture from the mathematics. This example fails that test. Can anyone justify its inclusion? If not, I'll remove it.

Mygskr (talk) 18:56, 9 November 2012 (UTC)

Introductions should introduce

This article leads with the sentence, "In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process." My bet is that even a majority of mathematically literate readers will not find this informative, especially since the Wikipedia article on stochastic processes does not seem to mention or define the concept of "time density" or "average information in a stochastic process." The mathematical definition of the limit that follows should be more clearly linked to the terms in this first sentence.

It seems to me that the main purpose of Wikipedia is to acquaint readers with subjects rather than serve as a repository of information for specialists. So I think that one should strive to make the introductory parts at the least clear and simple enough to be understood by people who possess background in the sciences (in this case, the mathematical and physical sciences). Of course this takes some care and thought, much more than is required to brandish jargon. (I do not say this as a criticism of the present article but merely to emphasize the general idea that being informative is more important than displaying expertise.) Skinnerd (talk) 19:15, 23 January 2013 (UTC)


I don't know, but I believe "time density" is nothing more than something over time. In this case it could also imply something per outcome. "Average information in a stochastic process" is just another name for (Shannon's) entropy. If someone can confirm these, a clearer wording would be, for instance:
"In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the average information of a stochastic process transmitted over time."
or
"In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the average information per outcome in a stochastic process."
177.68.225.247 (talk) 01:10, 25 March 2017 (UTC)
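If a concrete anchor for the "average information per outcome" reading helps: for an i.i.d. process the joint entropy factorizes, so the limit collapses to the single-symbol entropy,

:<math>H(X) = \lim_{n\to\infty} \frac{1}{n} H(X_1, \ldots, X_n) = \lim_{n\to\infty} \frac{n \, H(X_1)}{n} = H(X_1),</math>

i.e. exactly one bit per outcome for a fair coin and less for a biased one. The entropy rate extends this per-outcome average to processes with memory, where conditioning on the past reduces the information carried by each new outcome.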