In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process X_k divided by n, as n tends to infinity:

H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)
when the limit exists. An alternative, related quantity is:

H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)

For strongly stationary stochastic processes, the two limits exist and coincide: H(X) = H'(X).
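As a concrete check of the first definition, the joint block entropy H(X_1, ..., X_n)/n can be computed exactly for a small i.i.d. source, where it equals the single-symbol entropy H(X_1) for every n. A minimal sketch in Python; the binary source distribution is a hypothetical example, not taken from the text:

```python
import numpy as np
from itertools import product

# Hypothetical i.i.d. binary source with P(X=1) = 0.3 (an assumption for illustration).
p = np.array([0.7, 0.3])

def joint_entropy_rate(p, n):
    """H(X_1, ..., X_n) / n in bits; block probabilities factorize for an i.i.d. source."""
    H = 0.0
    for block in product(range(len(p)), repeat=n):
        prob = np.prod(p[list(block)])
        H -= prob * np.log2(prob)
    return H / n

# Single-symbol entropy H(X_1); the ratio above should match it for every n.
H1 = -np.sum(p * np.log2(p))
print([joint_entropy_rate(p, n) for n in (1, 2, 3)], H1)
```

For sources with memory the ratio is not constant in n, and the limit (when it exists) is the entropy rate.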
Entropy rates for Markov chains
Since a stationary, irreducible Markov chain visits state i with long-run frequency given by its stationary distribution, the entropy rate reduces to an average of the per-state transition entropies:

H(X) = -\sum_{i,j} \mu_i \, P_{ij} \log P_{ij}

where μ_i is the stationary distribution of the chain and P_{ij} is the transition probability from state i to state j.
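The Markov-chain entropy rate −∑_{i,j} μ_i P_{ij} log P_{ij} can be computed numerically by taking the stationary distribution as the left eigenvector of the transition matrix with eigenvalue 1. A minimal sketch; the two-state transition matrix is a hypothetical example:

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1); an assumption for illustration.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def stationary_distribution(P):
    """Left eigenvector of P with eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    # Perron-Frobenius: the largest eigenvalue of a stochastic matrix is 1.
    mu = np.real(v[:, np.argmax(np.real(w))])
    return mu / mu.sum()

def markov_entropy_rate(P):
    """H(X) = -sum_ij mu_i P_ij log2 P_ij, in bits per step."""
    mu = stationary_distribution(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)  # 0 log 0 taken as 0
    return -np.sum(mu[:, None] * terms)

print(stationary_distribution(P), markov_entropy_rate(P))
```

Equivalently, the rate is the μ-weighted average of the entropies of the rows of P, which is what the final sum computes.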
- Cover, T. and Thomas, J. (1991) Elements of Information Theory, John Wiley and Sons, Inc., ISBN 0-471-06259-6
- Systems Analysis, Modelling and Prediction (SAMP), University of Oxford: MATLAB code for estimating information-theoretic quantities for stochastic processes.