This lecture covers Gaussian Mixture Models (GMMs) and the concept of marginal likelihood. It explains the benefits of introducing latent assignment variables into the GMM and shows how marginalizing them out yields a cost function that no longer depends on them. The Expectation-Maximization (EM) algorithm is then presented as an elegant way to optimize GMMs through an iterative two-step procedure. The lecture details the computation of the marginal likelihood and the two steps of the algorithm: the Expectation (E) step, which computes the posterior responsibilities of the latent variables, and the Maximization (M) step, which re-estimates the model parameters given those responsibilities.
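The two-step procedure described above can be sketched as follows. This is a minimal, hypothetical 1-D implementation (the function name `em_gmm_1d` and the quantile-based initialization are illustrative choices, not from the lecture): the E-step computes each point's responsibilities under the current parameters, and the M-step re-estimates the mixture weights, means, and variances from those responsibilities.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=50):
    """Sketch: fit a 1-D Gaussian mixture with k components via EM."""
    n = len(x)
    pi = np.full(k, 1.0 / k)                       # mixture weights
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over the data
    var = np.full(k, np.var(x))                    # start with the overall variance
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = p(component j | x_i)
        log_p = (np.log(pi)
                 - 0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (x[:, None] - mu) ** 2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)  # subtract max for numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nk = r.sum(axis=0)                         # effective count per component
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Synthetic example: two well-separated clusters.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4.0, 1.0, 500), rng.normal(4.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(x, k=2)
```

On such well-separated data the recovered means should land near the true cluster centers, and each EM iteration is guaranteed not to decrease the marginal likelihood.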