Given a statistical model consisting of a set X of observed data, a set Z of unobserved latent data or missing values, and a vector of unknown parameters θ, along with a likelihood function L(θ; X, Z) = p(X, Z | θ), the maximum likelihood estimate (MLE) of the unknown parameters is determined by the marginal likelihood of the observed data, L(θ; X) = p(X | θ) = Σ_Z p(X, Z | θ) (with an integral in place of the sum when Z is continuous).
However, this quantity is often intractable.
The EM algorithm seeks to find the MLE of the marginal likelihood by iteratively applying the following two steps:
- Expectation step (E step): Calculate the expected value of the log likelihood function, with respect to the conditional distribution of Z given X under the current estimate of the parameters θ^(t): Q(θ | θ^(t)) = E_{Z | X, θ^(t)}[log L(θ; X, Z)]
- Maximization step (M step): Find the parameters that maximize this quantity: θ^(t+1) = argmax_θ Q(θ | θ^(t))
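The two steps above can be sketched concretely for a simple model. The following is an illustrative example only (the two-component, one-dimensional Gaussian mixture, the function name, and the initialization scheme are assumptions, not from the text): the E step computes each point's posterior responsibility for component 1, and the M step re-estimates the parameters from those responsibilities.

```python
import math

def em_gmm_1d(xs, n_iter=50):
    """Sketch of EM for a two-component 1-D Gaussian mixture.

    theta = (pi, mu1, s1, mu2, s2); the latent Z picks the component.
    """
    # Crude initialization: one center at each extreme of the data.
    mu1, mu2 = min(xs), max(xs)
    s1 = s2 = (max(xs) - min(xs)) / 4 or 1.0
    pi = 0.5
    for _ in range(n_iter):
        # E step: posterior probability that each point came from component 1.
        resp = []
        for x in xs:
            p1 = pi * math.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / s1
            p2 = (1 - pi) * math.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / s2
            resp.append(p1 / (p1 + p2))
        # M step: re-estimate parameters from the (soft) responsibilities.
        n1 = sum(resp)
        n2 = len(xs) - n1
        pi = n1 / len(xs)
        mu1 = sum(r * x for r, x in zip(resp, xs)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, xs)) / n2
        s1 = math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, xs)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, xs)) / n2) or 1e-6
    return pi, mu1, s1, mu2, s2
```

Each iteration maximizes Q(θ | θ^(t)) in closed form, which is what makes Gaussian mixtures a standard showcase for EM.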
Note that in typical models to which EM is applied:
- The observed data points may be discrete (taking values in a finite or countably infinite set) or continuous (taking values in an uncountably infinite set). There may in fact be a vector of observations associated with each data point.
- The missing values (aka latent variables) are discrete, drawn from a fixed number of values, and there is one latent variable per observed data point.
- The parameters are continuous, and are of two kinds: Parameters that are associated with all data points, and parameters associated with a particular value of a latent variable (i.e. associated with all data points whose corresponding latent variable has a particular value).
However, it is possible to apply EM to other sorts of models.
The motivation is as follows. If we know the value of the parameters θ, we can usually find the value of the latent variables Z by maximizing the log-likelihood over all possible values of Z, either simply by iterating over Z or through an algorithm such as the Viterbi algorithm for hidden Markov models. Conversely, if we know the value of the latent variables Z, we can find an estimate of the parameters θ fairly easily, typically by simply grouping the observed data points according to the value of the associated latent variable and averaging the values, or some function of the values, of the points in each group. This suggests an iterative algorithm, in the case where both θ and Z are unknown:
- First, initialize the parameters θ to some random values.
- Compute the best value for Z given these parameter values.
- Then, use the just-computed values of Z to compute a better estimate for the parameters θ. Parameters associated with a particular value of Z will use only those data points whose associated latent variable has that value.
- Iterate steps 2 and 3 until convergence.
The algorithm as just described monotonically approaches a local minimum of the cost function, and is commonly called hard EM. The k-means algorithm is an example of this class of algorithms.
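The hard-EM view of k-means can be sketched as follows (a minimal one-dimensional, two-center illustration; the function name and data layout are assumptions, not from the text): the hard E step assigns each point to its nearest center, and the hard M step recomputes each center from its own points only.

```python
def kmeans_1d(xs, k1, k2, n_iter=20):
    """Sketch of hard EM (k-means) with two centers in one dimension."""
    for _ in range(n_iter):
        # Hard E step: each point is assigned wholly to its nearest center.
        g1 = [x for x in xs if abs(x - k1) <= abs(x - k2)]
        g2 = [x for x in xs if abs(x - k1) > abs(x - k2)]
        # Hard M step: each center is the mean of its own group's points only.
        if g1:
            k1 = sum(g1) / len(g1)
        if g2:
            k2 = sum(g2) / len(g2)
    return k1, k2
```

The "hard choice" is the all-or-nothing assignment in the E step; the next section's soft EM replaces it with fractional responsibilities.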
However, we can do somewhat better: rather than making a hard choice for Z given the current parameter values, and averaging only over the set of data points associated with a particular value of Z, we can instead determine the probability of each possible value of Z for each data point, and then use the probabilities associated with a particular value of Z to compute a weighted average over the entire set of data points. The resulting algorithm is commonly called soft EM, and is the type of algorithm normally associated with EM. The counts used to compute these weighted averages are called soft counts (as opposed to the hard counts used in a hard-EM-type algorithm such as k-means). The probabilities computed for Z are posterior probabilities and are what is computed in the E step. The soft counts used to compute new parameter values are what is computed in the M step.
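The difference between soft and hard counts can be shown in a small M-step sketch (the function name and the example posteriors are illustrative assumptions, not from the text): the soft update weights every point by its posterior, while the hard update first rounds each posterior to 0 or 1.

```python
def m_step_counts(xs, posteriors):
    """Compare soft-count and hard-count M-step updates for one value of Z.

    posteriors[i] is the (assumed, already-computed) posterior probability
    that point xs[i] has that latent value.
    """
    # Soft counts: every point contributes, weighted by its posterior.
    n_soft = sum(posteriors)
    mu_soft = sum(p * x for p, x in zip(posteriors, xs)) / n_soft
    # Hard counts: each point contributes fully to one value of Z or not at all.
    hard = [round(p) for p in posteriors]
    n_hard = sum(hard)
    mu_hard = sum(h * x for h, x in zip(hard, xs)) / n_hard
    return (n_soft, mu_soft), (n_hard, mu_hard)
```

The soft count n_soft is generally fractional; it is exactly the quantity a soft-EM M step divides by when re-estimating a parameter tied to one value of Z.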