
Detailed balance

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Quantling (talk | contribs) at 16:30, 27 August 2010 (Partially revert my own edits: stationary process → stationary distribution). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In mathematics and statistical mechanics, a Markov process is said to show detailed balance if the transition rates between each pair of states i and j in the state space obey

πi Pij = πj Pji,

where P is the Markov transition matrix (transition probability), i.e., Pij = P(Xt = j | Xt − 1 = i); and πi and πj are the equilibrium probabilities of being in states i and j, respectively.[1]
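The condition can be checked numerically. The sketch below, with a transition matrix chosen purely for illustration (a small birth–death chain, a class that always satisfies detailed balance), computes the stationary distribution as the left eigenvector of P for eigenvalue 1 and verifies that the probability flows πi Pij are symmetric:

```python
import numpy as np

# A 3-state birth-death chain; matrix chosen for illustration.
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# Detailed balance: pi_i * P_ij == pi_j * P_ji for every pair (i, j),
# i.e. the matrix of probability flows is symmetric.
flows = pi[:, None] * P          # flows[i, j] = pi_i * P_ij
assert np.allclose(flows, flows.T)
print("detailed balance holds; pi =", pi)
```

For this matrix the stationary distribution is (1/4, 1/2, 1/4), and each pairwise flow balances its reverse.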

The definition carries over straightforwardly to continuous variables, where π becomes a probability density, and P(s′, s) a transition kernel probability density from state s′ to state s:

π(s′) P(s′, s) = π(s) P(s, s′).

A Markov process that satisfies the detailed balance equations is said to be a reversible Markov process or reversible Markov chain.[1]

The detailed balance condition is stronger than that required merely for a stationary distribution; that is, there are Markov processes with stationary distributions that do not show detailed balance.
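A standard counterexample makes this concrete. The sketch below (chain chosen for illustration) is a 3-state cycle whose transition matrix is doubly stochastic, so the uniform distribution is stationary, yet probability circulates around the cycle and detailed balance fails:

```python
import numpy as np

# A 3-state chain that steps 0 -> 1 -> 2 -> 0 with probability 2/3
# and backwards with probability 1/3.  Illustrative example.
P = np.array([[0.0, 2/3, 1/3],
              [1/3, 0.0, 2/3],
              [2/3, 1/3, 0.0]])

pi = np.full(3, 1/3)

# pi is stationary: pi P = pi (each column of P sums to 1) ...
assert np.allclose(pi @ P, pi)

# ... but detailed balance fails: pi_0 * P_01 = 2/9, pi_1 * P_10 = 1/9.
print(pi[0] * P[0, 1], pi[1] * P[1, 0])
```

The net clockwise circulation of probability is exactly what detailed balance rules out.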

Detailed balance implies that, around any closed cycle of states, there is no net flow of probability. For example, it implies that, for all a, b and c,

Pab Pbc Pca = Pac Pcb Pba.
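This cycle condition (Kolmogorov's criterion) can be verified directly. The sketch below builds a reversible chain from a symmetric "flow" matrix F, setting Pij = Fij / Σk Fik, which satisfies detailed balance with πi proportional to the row sums; the construction and numbers are illustrative assumptions, not from the source:

```python
import numpy as np
from itertools import permutations

# A symmetric flow matrix; normalizing its rows yields a transition
# matrix that satisfies detailed balance by construction.
F = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 2.0],
              [1.0, 2.0, 4.0]])
P = F / F.sum(axis=1, keepdims=True)

# Kolmogorov's criterion: around every cycle a -> b -> c -> a the
# product of transition probabilities equals the reverse product.
for a, b, c in permutations(range(3), 3):
    assert np.isclose(P[a, b] * P[b, c] * P[c, a],
                      P[a, c] * P[c, b] * P[b, a])
print("cycle condition holds for all 3-cycles")
```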

When a Markov process is reversible, its dynamics can be described in terms of an entropy function that acts like a potential, in that the entropy of the Markov process is always increasing, and reaches its maximum at the stationary distribution.[citation needed]

Detailed balance is a weaker condition than requiring the transition matrix to be symmetric, Pij = Pji or P(s′, s) = P(s, s′). A symmetric transition matrix would imply that the uniform distribution over the states is automatically an equilibrium distribution. For continuous systems with detailed balance, it may be possible to continuously transform the coordinates until the equilibrium distribution is uniform, with a transition kernel which then is symmetric. In the discrete case, it may be possible to achieve something similar, by breaking the Markov states into a degeneracy of sub-states.
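The first claim of this paragraph admits a short check. With an illustrative symmetric transition matrix (values assumed for the example), the uniform distribution is stationary and detailed balance holds trivially:

```python
import numpy as np

# A symmetric stochastic matrix, chosen for illustration.
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])
assert np.allclose(P, P.T)

pi = np.full(3, 1/3)

# Uniform pi is stationary: by symmetry each column also sums to 1.
assert np.allclose(pi @ P, pi)

# Detailed balance is immediate: pi_i * P_ij = pi_j * P_ji.
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)
```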

  1. ^ a b O'Hagan, Anthony; Forster, Jonathan (2004). "Section 10.3". Kendall's Advanced Theory of Statistics, Volume 2B: Bayesian Inference. New York: Oxford University Press. p. 263. ISBN 0-340-80752-0.
