This lecture delves into learning latent models with graphical structure, focusing on scenarios in which samples are available only for a subset of the variables, the remaining ones being latent. The instructor explains the concept of causal sufficiency and presents an example of latent learning in an undirected graphical model. The lecture covers tree-decomposable probability distributions, symmetric discrete distributions, and the conditions under which a latent tree is learnable. The instructor introduces a notion of distance among variables, based on correlation coefficients, and discusses the sibling grouping lemma, which helps classify nodes in a tree. The lecture concludes with a proposition by Pearl from 1988, stating that every tree-decomposable distribution has a minimal tree extension, which can be recovered in particular for Gaussian and binary distributions.
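The summary does not spell out the distance itself; in the standard latent-tree literature (e.g. for Gaussian and symmetric binary models) it is commonly taken to be the information distance d(i, j) = -log |rho_ij|, which is additive along the tree. The sketch below, with hypothetical helper names `information_distance` and `looks_like_siblings`, illustrates how such a distance and a sibling-grouping style test could be computed from a correlation matrix; it is an illustration under these assumptions, not the lecture's own algorithm.

```python
import numpy as np

# Assumption: additive "information distance" d(i, j) = -log |rho_ij|
# built from pairwise correlation coefficients. For sibling leaves i, j
# under a common (possibly latent) parent, d(i, k) - d(j, k) is the same
# for every other node k, which is the idea behind sibling grouping.

def information_distance(corr):
    """Turn a correlation matrix into pairwise distances -log|rho|."""
    with np.errstate(divide="ignore"):
        d = -np.log(np.abs(corr))
    np.fill_diagonal(d, 0.0)
    return d

def looks_like_siblings(d, i, j, others, tol=1e-6):
    """Return True if d[i, k] - d[j, k] is (numerically) constant over k."""
    diffs = [d[i, k] - d[j, k] for k in others]
    return max(diffs) - min(diffs) < tol

# Toy example: a latent "star" with a hidden root h and observed leaves 0..3,
# where correlations factor along the tree, rho_ij = rho_ih * rho_hj.
rho_to_root = np.array([0.9, 0.8, 0.7, 0.6])
corr = np.outer(rho_to_root, rho_to_root)
np.fill_diagonal(corr, 1.0)

d = information_distance(corr)
print(looks_like_siblings(d, 0, 1, others=[2, 3]))  # True: common latent parent
```

In the toy star model the test returns True because the additive distance through the hidden root makes d(0, k) - d(1, k) independent of k; leaves that do not share a parent in a larger tree would fail this constancy check.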