A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states. An HMM can be considered the simplest dynamic Bayesian network. The mathematics behind the HMM was developed by L. E. Baum and coworkers. It is closely related to earlier work on the optimal nonlinear filtering problem in stochastic processes by Ruslan L. Stratonovich, who was the first to describe the forward-backward procedure.
In simpler Markov models (like a Markov chain), the state is directly visible to the observer, and therefore the state transition probabilities are the only parameters. In a hidden Markov model, the state is not directly visible, but an output, dependent on the state, is visible. Each state has a probability distribution over the possible output tokens. Therefore, the sequence of tokens generated by an HMM gives some information about the sequence of states. Note that the adjective 'hidden' refers to the state sequence through which the model passes, not to the parameters of the model; even if the model parameters are known exactly, the model is still 'hidden'.
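The relationship between hidden states and visible outputs can be sketched with a small generator. The two weather states, three output tokens, and all probabilities below are illustrative assumptions, not taken from the text:

```python
import random

random.seed(0)

# A minimal two-state HMM sketch; states, tokens, and probabilities
# here are invented for illustration.
states = ["Rainy", "Sunny"]
transition = {  # P(next state | current state)
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}
emission = {  # P(observed token | hidden state)
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def sample(dist):
    """Draw one outcome from a {outcome: probability} mapping."""
    r = random.random()
    cumulative = 0.0
    for outcome, p in dist.items():
        cumulative += p
        if r < cumulative:
            return outcome
    return outcome  # guard against floating-point round-off

def generate(n, start="Rainy"):
    """Generate n (hidden state, observed token) pairs."""
    state, pairs = start, []
    for _ in range(n):
        pairs.append((state, sample(emission[state])))
        state = sample(transition[state])
    return pairs

pairs = generate(5)
# An observer sees only the tokens; the state sequence stays hidden.
observed = [token for _, token in pairs]
```

The observed tokens are informative about the hidden states (e.g. "walk" is more likely under "Sunny"), which is exactly the sense in which the output sequence "gives some information about the sequence of states."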
Hidden Markov models are especially known for their application in temporal pattern recognition, such as speech, handwriting, and gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics.
A hidden Markov model can be considered a generalization of a mixture model in which the hidden variables (or latent variables), which control the mixture component selected for each observation, are related through a Markov process rather than being independent of each other.
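The contrast between a mixture model and an HMM can be made concrete by sampling the latent variables both ways. The two components, their Gaussian emissions, and the "sticky" transition matrix below are illustrative assumptions:

```python
import random

random.seed(1)

# Sketch contrasting a mixture model with an HMM; components, means,
# and probabilities are invented for illustration.
components = [0, 1]
means = [0.0, 5.0]          # emission mean for each latent component
weights = [0.5, 0.5]        # mixture weights (also the initial distribution)
sticky = {0: [0.9, 0.1], 1: [0.1, 0.9]}  # HMM transition rows

def mixture_sample(n):
    """Mixture model: each latent component is drawn independently."""
    zs = [random.choices(components, weights)[0] for _ in range(n)]
    return zs, [random.gauss(means[z], 1.0) for z in zs]

def hmm_sample(n):
    """HMM: each latent component depends on the previous one."""
    z = random.choices(components, weights)[0]
    zs = []
    for _ in range(n):
        zs.append(z)
        z = random.choices(components, sticky[z])[0]
    return zs, [random.gauss(means[z], 1.0) for z in zs]
```

With the sticky transition matrix above, the HMM's latent sequence tends to stay in one component for long runs, whereas the mixture model's latent draws are independent from observation to observation; the emission step is identical in both.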
Other articles related to hidden Markov models:

Modern general-purpose speech recognition systems are based on hidden Markov models. These are statistical models that output a sequence of symbols or quantities, and speech can be thought of as a Markov model for many stochastic purposes.

In the hidden Markov models considered above, the state space of the hidden variables is discrete, while the observations themselves can be either discrete or continuous. Hidden Markov models can also be generalized to allow continuous state spaces. Examples of such models are those where the Markov process over hidden variables is a linear dynamical system, with a linear relationship among related variables and where all hidden and observed variables follow a Gaussian distribution.

Such linear dynamical systems are modelled on a Markov chain built on linear operators perturbed by Gaussian noise; a further linear operator, mixed with more noise, generates the observed outputs from the true ("hidden") state. This may be regarded as analogous to the hidden Markov model, with the key difference that the hidden state variables take values in a continuous space rather than a discrete state space.
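The continuous-state analogue described above can be sketched as a one-dimensional simulation. The coefficients and noise levels below are illustrative assumptions:

```python
import random

random.seed(2)

# One-dimensional linear dynamical system: a linear operator plus Gaussian
# noise drives the hidden state, and a second linear operator plus noise
# produces each observation. All coefficients are invented for illustration.
a, c = 0.9, 1.0        # state-transition and observation operators
q_sd, r_sd = 0.5, 1.0  # process and observation noise std deviations

def simulate(n, x0=0.0):
    """Return n hidden states and their noisy observations."""
    x, xs, ys = x0, [], []
    for _ in range(n):
        x = a * x + random.gauss(0.0, q_sd)         # hidden state update
        ys.append(c * x + random.gauss(0.0, r_sd))  # observed output
        xs.append(x)
    return xs, ys
```

As in a discrete HMM, only `ys` would be available to an observer; the difference is that the hidden state `x` ranges over a continuum rather than a finite set.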