
Steady-state Analysis and The Time-inhomogeneous Markov Chain

A Markov chain need not be time-homogeneous to have an equilibrium distribution. If there is a probability distribution over states π such that

    π_j = Σ_{i ∈ S} π_i Pr(X_{n+1} = j | X_n = i)

for every state j and every time n, then π is an equilibrium distribution of the Markov chain. This situation can occur in Markov chain Monte Carlo (MCMC) methods, where several different transition matrices are used — each because it mixes efficiently for a particular kind of move — but every matrix respects the same shared equilibrium distribution.
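A minimal numerical sketch of this idea, using NumPy and two illustrative 2×2 matrices chosen here for the example (both are doubly stochastic, so both preserve the uniform distribution): a chain that alternates between them is time-inhomogeneous, yet it still settles to the shared equilibrium.

```python
import numpy as np

# Two different transition matrices (illustrative values).
# Both are doubly stochastic, so both preserve the uniform
# distribution pi = (0.5, 0.5) as their equilibrium.
P1 = np.array([[0.9, 0.1],
               [0.1, 0.9]])
P2 = np.array([[0.2, 0.8],
               [0.8, 0.2]])
pi = np.array([0.5, 0.5])

# The equilibrium condition pi_j = sum_i pi_i Pr(X_{n+1}=j | X_n=i)
# holds for each matrix separately:
assert np.allclose(pi @ P1, pi)
assert np.allclose(pi @ P2, pi)

# A time-inhomogeneous chain alternating P1 and P2 still converges
# to pi from an arbitrary starting distribution.
mu = np.array([1.0, 0.0])   # start deterministically in state 0
for n in range(50):
    mu = mu @ (P1 if n % 2 == 0 else P2)

print(np.round(mu, 6))      # close to [0.5, 0.5]
```

The key point the sketch illustrates: the matrices need not commute or be equal; it is enough that each one individually leaves π invariant, which is exactly why MCMC samplers can cycle through different update rules.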

