Expectation-Maximization (EM) is a powerful method for performing maximum likelihood estimation in models with hidden (latent) variables. In this video I provide a brief tutorial on EM, showing how it can be understood as coordinate ascent on the negative variational free energy (aka the evidence lower bound, or ELBO). The ELBO is a function of two arguments: a distribution over the latent variables, and the model parameters we want to estimate. The E-step then corresponds to maximizing the ELBO with respect to the distribution, and the M-step to maximizing it with respect to the parameters.
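For reference, here is a minimal sketch of that decomposition, in notation chosen here (x the observed data, z the latents, q the distribution over the latents, θ the parameters); the video and notes may use different symbols:

```latex
% ELBO: a lower bound on the log-likelihood, valid for any distribution q(z)
\mathcal{F}(q, \theta)
  = \mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z \mid \theta)}{q(z)}\right]
  = \log p(x \mid \theta) - \mathrm{KL}\!\left(q(z) \,\|\, p(z \mid x, \theta)\right)

% E-step: maximize over q with \theta^{(t)} fixed; the KL term vanishes
q^{(t+1)}(z) = \arg\max_{q} \mathcal{F}(q, \theta^{(t)}) = p(z \mid x, \theta^{(t)})

% M-step: maximize over \theta with q^{(t+1)} fixed
\theta^{(t+1)} = \arg\max_{\theta} \mathcal{F}(q^{(t+1)}, \theta)
             = \arg\max_{\theta} \mathbb{E}_{q^{(t+1)}}\!\left[\log p(x, z \mid \theta)\right]
```

Since the KL term is non-negative, the ELBO lower-bounds log p(x | θ); each coordinate-ascent step can only increase the ELBO, and the E-step makes the bound tight, so the marginal likelihood is non-decreasing across iterations.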
The blog post containing my notes is here: https://sinatootoonian.com/index.php/2025/03/09/understanding-expectation-maximization-as-coordinate-ascent/