**Reversible Markov Chains**

Reversibility in Markov chains is characterized by Kolmogorov's criterion, which demands that for a chain to be reversible, the product of transition rates around any closed loop of states must be the same in both directions. A Markov process satisfies the detailed balance equations if and only if it is a **reversible Markov process** or **reversible Markov chain**. A Markov process is said to have detailed balance if the transition probability, *P*, between each pair of states *i* and *j* in the state space obeys

π_{i} *P*_{ij} = π_{j} *P*_{ji}

where *P* is the Markov transition matrix (transition probability), *i.e.*, *P*_{ij} = *P*(*X*_{t} = *j* | *X*_{t − 1} = *i*); and π_{i} and π_{j} are the equilibrium probabilities of being in states *i* and *j*, respectively. When Pr(*X*_{t−1} = *i*) = π_{i} for all *i*, this is equivalent to the joint probability matrix, Pr(*X*_{t−1} = *i*, *X*_{t} = *j*) being symmetric in *i* and *j*; or symmetric in *t* − 1 and *t*.
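As a minimal numerical sketch, the condition above can be checked for a concrete transition matrix (the two-state matrix here is an arbitrary illustrative choice; every two-state chain satisfies detailed balance). The equilibrium distribution π is recovered as the left eigenvector of *P* for eigenvalue 1, and detailed balance is exactly the statement that the joint matrix π_{i} *P*_{ij} is symmetric:

```python
import numpy as np

# Illustrative two-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Equilibrium distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# Detailed balance: pi_i * P_ij == pi_j * P_ji for every pair (i, j),
# i.e. the joint probability matrix flow[i, j] = pi_i * P_ij is symmetric.
flow = pi[:, None] * P
print(np.allclose(flow, flow.T))  # True
```

Here π works out to (0.75, 0.25), and both off-diagonal flows equal 0.075, so no net probability flows between the two states.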

The definition carries over straightforwardly to continuous variables, where π becomes a probability density, and *P*(*s*′, *s*) a transition kernel probability density from state *s*′ to state *s*:

π(*s*′) *P*(*s*′, *s*) = π(*s*) *P*(*s*, *s*′)

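A standard continuous example (assumed here for illustration) is the Gaussian AR(1) chain *X*_{t} = ρ*X*_{t−1} + ε with stationary density N(0, 1); its transition density satisfies the continuous detailed balance condition, which the sketch below verifies numerically at a few arbitrary point pairs:

```python
import math

def normal_pdf(x, mean, var):
    """Density of a normal distribution with given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

rho = 0.5  # assumed autocorrelation; any |rho| < 1 works

def pi(s):
    """Stationary density: standard normal."""
    return normal_pdf(s, 0.0, 1.0)

def kernel(s_prev, s):
    """Transition density P(s', s) of the Gaussian AR(1) chain."""
    return normal_pdf(s, rho * s_prev, 1.0 - rho ** 2)

# Check pi(s') P(s', s) == pi(s) P(s, s') at arbitrary point pairs.
for a, b in [(0.3, -1.2), (1.5, 0.7), (-0.4, 2.0)]:
    assert math.isclose(pi(a) * kernel(a, b), pi(b) * kernel(b, a))
```

The balance holds exactly here because the exponent of π(*s*′)*P*(*s*′, *s*) is symmetric in *s* and *s*′.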
The detailed balance condition is stronger than that required merely for a stationary distribution; that is, there are Markov processes with stationary distributions that do not have detailed balance. Detailed balance implies that, around any closed cycle of states, there is no net flow of probability. For example, it implies that, for all *a*, *b* and *c*,

*P*_{ab} *P*_{bc} *P*_{ca} = *P*_{ac} *P*_{cb} *P*_{ba}

This can be proved by substitution from the definition. In the case of a positive transition matrix, the "no net flow" condition implies detailed balance.
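A hedged sketch of both points: the helper below checks the cycle condition for a given loop of states, and the illustrative biased 3-cycle matrix (doubly stochastic, so the uniform distribution is stationary) shows a chain with a stationary distribution but no detailed balance, since probability circulates around the loop in one direction:

```python
import numpy as np

def kolmogorov_cycle_holds(P, cycle):
    """Check that the transition-probability product around the cycle
    equals the product in the reverse direction."""
    n = len(cycle)
    fwd = np.prod([P[cycle[k], cycle[(k + 1) % n]] for k in range(n)])
    bwd = np.prod([P[cycle[(k + 1) % n], cycle[k]] for k in range(n)])
    return bool(np.isclose(fwd, bwd))

# Biased 3-cycle: doubly stochastic, so the uniform distribution is
# stationary, but probability flows clockwise and detailed balance fails.
P = np.array([[0.0, 0.9, 0.1],
              [0.1, 0.0, 0.9],
              [0.9, 0.1, 0.0]])
print(kolmogorov_cycle_holds(P, [0, 1, 2]))  # False: 0.9**3 != 0.1**3
```

Replacing this matrix with any symmetric stochastic matrix makes the check return `True`, since then every forward and reverse product coincide term by term.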

Transition matrices that are symmetric (*P*_{ij} = *P*_{ji} or *P*(*s*′, *s*) = *P*(*s*, *s*′)) always have detailed balance. In these cases, a uniform distribution over the states is an equilibrium distribution. For continuous systems with detailed balance, it may be possible to continuously transform the coordinates until the equilibrium distribution is uniform, with a transition kernel which then is symmetric. In the case of discrete states, it may be possible to achieve something similar by splitting the Markov states into an appropriate number of degenerate sub-states.
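The symmetric case can be verified directly; the matrix below is an arbitrary symmetric stochastic example, and detailed balance with the uniform distribution is immediate because π_{i} *P*_{ij} = *P*_{ij}/*n* = *P*_{ji}/*n* = π_{j} *P*_{ji}:

```python
import numpy as np

# Arbitrary symmetric stochastic matrix (rows and columns sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
assert np.allclose(P, P.T)

pi = np.full(3, 1 / 3)            # uniform distribution over the states
assert np.allclose(pi @ P, pi)    # uniform is an equilibrium distribution
# Detailed balance: pi_i * P_ij = P_ij / 3 = P_ji / 3 = pi_j * P_ji.
```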


### Conductance and mixing time

For an ergodic **reversible Markov chain** with an underlying graph *G*, the conductance is a way to measure how hard it is to leave a small set of nodes. Alistair Sinclair showed that conductance is closely tied to mixing time in ergodic reversible Markov chains.
