Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of the first n members of the process (X_k), divided by n, as n tends to infinity:

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)

when the limit exists. An alternative, related quantity is:

H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)

For strongly stationary stochastic processes, H(X) = H'(X). The entropy rate can be thought of as a general property of stochastic sources; this connection is made precise by the asymptotic equipartition property, which states that, for a stationary ergodic source, a typical sequence of length n has probability close to 2^{-nH(X)}.
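A sketch of why the two quantities agree in the stationary case: by the chain rule for entropy,

H(X_1, X_2, \dots, X_n) = \sum_{k=1}^{n} H(X_k \mid X_{k-1}, \dots, X_1)

and for a stationary process the conditional entropies H(X_k | X_{k-1}, ..., X_1) are non-increasing in k (conditioning cannot increase entropy, and stationarity lets each term be compared with its predecessor), so they converge to H'(X). The average on the right-hand side is then a Cesàro mean of a convergent sequence, which converges to the same limit, giving H(X) = H'(X).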

Entropy rates for Markov chains

Since a Markov chain that is irreducible, aperiodic, and positive recurrent has a unique stationary distribution, and converges to it from any initial distribution, the entropy rate of the stochastic process it defines is independent of that initial distribution.

For example, for such a Markov chain Y_k defined on a countable number of states, given its transition matrix P_{ij}, H(Y) is given by:

H(Y) = - \sum_{i,j} \mu_i P_{ij} \log P_{ij}

where μ_i is the probability of state i under the stationary distribution μ of the chain.
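As a minimal numerical sketch (the two-state transition matrix below is an illustrative assumption, not taken from the article), the entropy rate can be computed in Python by finding the stationary distribution as a left eigenvector of the transition matrix and evaluating the sum above; taking logarithms base 2 gives the rate in bits:

import numpy as np

# Illustrative two-state transition matrix (rows sum to 1); the values
# are assumed for this example.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution mu: the left eigenvector of P with eigenvalue 1,
# normalized so its entries sum to 1 (so that mu @ P == mu).
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.isclose(eigvals, 1.0)][:, 0])
mu /= mu.sum()

# Entropy rate H(Y) = -sum_{i,j} mu_i P_ij log2 P_ij (in bits),
# using the convention 0 * log 0 = 0 for impossible transitions.
terms = np.where(P > 0, P * np.log2(np.where(P > 0, P, 1.0)), 0.0)
entropy_rate = -np.sum(mu[:, None] * terms)

print(f"stationary distribution: {mu}")          # [0.8, 0.2]
print(f"entropy rate: {entropy_rate:.4f} bits")  # ~0.5694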

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
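This follows directly from the definition: independence makes the joint entropy additive and identical distribution makes every term equal, so

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=1}^{n} H(X_k) = \lim_{n \to \infty} \frac{n \, H(X_1)}{n} = H(X_1)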
