# User:Timothee Flutre/Notebook/Postdoc/2012/08/16


## Variational Bayes approach for the mixture of Normals

• Motivation: I have described the basics of mixture models and the EM algorithm, in a frequentist context, on another page, which is worth reading before continuing. Here I am interested in the Bayesian approach, as well as in a specific variational method nicknamed "Variational Bayes".

• Data: we have $N$ univariate observations, $y_1, \ldots, y_N$, gathered into the vector $\mathbf{y}$.

• Assumptions: we assume the observations to be exchangeable and distributed according to a mixture of $K$ Normal distributions. The parameters of this model are the mixture weights ($w_1, \ldots, w_K$), the means ($\mu_1, \ldots, \mu_K$) and the precisions ($\tau_1, \ldots, \tau_K$) of each mixture component, all gathered into $\Theta = \{w_1, \ldots, w_K, \mu_1, \ldots, \mu_K, \tau_1, \ldots, \tau_K\}$. There are two constraints: $\forall k, \; w_k \ge 0$ and $\sum_{k=1}^K w_k = 1$.
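To make the model concrete, here is a minimal sketch that simulates data from it. The values of $N$, $K$ and of the parameters below are illustrative assumptions, not taken from this notebook:

```python
import numpy as np

# Simulate from a K-component Normal mixture (illustrative settings).
rng = np.random.default_rng(1)
N, K = 500, 2
w = np.array([0.3, 0.7])     # mixture weights (non-negative, sum to 1)
mu = np.array([-2.0, 3.0])   # component means
tau = np.array([1.0, 0.25])  # component precisions (inverse variances)

z = rng.choice(K, size=N, p=w)                 # component of each observation
y = rng.normal(mu[z], 1.0 / np.sqrt(tau[z]))   # observations y_1, ..., y_N
```

Note that the Normals are parameterized by their precisions, so the standard deviation passed to the sampler is $1/\sqrt{\tau_k}$.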

• Observed likelihood:

$$p(\mathbf{y} | \Theta) = \prod_{n=1}^N p(y_n | \Theta) = \prod_{n=1}^N \sum_{k=1}^K w_k \, \mathcal{N}(y_n; \mu_k, \tau_k^{-1})$$
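The observed log-likelihood can be computed stably by applying logsumexp over the components; a short sketch (the function name is mine):

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

def loglik_obs(y, w, mu, tau):
    """log p(y|Theta) = sum_n log sum_k w_k N(y_n; mu_k, 1/tau_k)."""
    # log_terms has shape (N, K): log w_k + log N(y_n; mu_k, tau_k^{-1})
    log_terms = (np.log(w)
                 + norm.logpdf(y[:, None], loc=mu, scale=1.0 / np.sqrt(tau)))
    # logsumexp over k, then sum over n
    return logsumexp(log_terms, axis=1).sum()
```

Working on the log scale avoids underflow when $N$ is large or the components are well separated.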

• Latent variables: let's introduce $N$ latent variables, $z_1, \ldots, z_N$, gathered into the vector $\mathbf{z}$. Each $z_n$ is a vector of length $K$ with a single 1 indicating the component to which the $n^{\text{th}}$ observation belongs, and $K-1$ zeroes.

• Augmented likelihood:

$$p(\mathbf{y}, \mathbf{z} | \Theta) = \prod_{n=1}^N p(y_n, z_n | \Theta) = \prod_{n=1}^N \prod_{k=1}^K \left( w_k \, \mathcal{N}(y_n; \mu_k, \tau_k^{-1}) \right)^{z_{nk}}$$
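Because each $z_n$ has exactly one nonzero entry, the exponent $z_{nk}$ simply selects the term of the assigned component. A sketch of the augmented log-likelihood, assuming a one-hot $(N, K)$ encoding of $\mathbf{z}$ (the function name is mine):

```python
import numpy as np
from scipy.stats import norm

def loglik_aug(y, z_onehot, w, mu, tau):
    """log p(y, z | Theta) for a one-hot (N, K) matrix z_onehot.

    The double product over n and k reduces, on the log scale, to
    summing log w_k + log N(y_n; mu_k, 1/tau_k) over the entries
    where z_nk = 1.
    """
    log_terms = (np.log(w)
                 + norm.logpdf(y[:, None], loc=mu, scale=1.0 / np.sqrt(tau)))
    return (z_onehot * log_terms).sum()
```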

• Priors: we choose conjugate ones
• for the parameters: $p(\mu_k | \tau_k) = \mathcal{N}(\mu_k; \mu_0, (\kappa_0 \tau_k)^{-1})$ and $p(\tau_k) = \mathcal{G}(\tau_k; \alpha_0, \beta_0)$
• for the latent variables: $p(z_n | \mathbf{w}) = \mathcal{M}ult(z_n; 1, \mathbf{w})$ and $p(\mathbf{w}) = \mathcal{D}ir(\mathbf{w}; \gamma)$
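A minimal sketch of drawing parameters from these conjugate priors; the hyperparameter values below are illustrative assumptions following common Normal-Gamma and Dirichlet notation, not values from this notebook:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2
# Hyperparameters (illustrative values).
gamma = np.ones(K)        # Dirichlet concentration for the weights
mu0, kappa0 = 0.0, 1.0    # Normal prior on mu_k given tau_k
alpha0, beta0 = 2.0, 2.0  # Gamma prior (shape, rate) on tau_k

w = rng.dirichlet(gamma)                           # w ~ Dir(gamma)
tau = rng.gamma(alpha0, 1.0 / beta0, size=K)       # tau_k ~ Ga(alpha0, beta0)
mu = rng.normal(mu0, 1.0 / np.sqrt(kappa0 * tau))  # mu_k | tau_k ~ N(mu0, (kappa0 tau_k)^{-1})
```

NumPy's gamma sampler takes a scale parameter, hence the $1/\beta_0$ when the prior is written with a rate.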