This lecture covers probabilistic models for text, including word embeddings that capture word semantics, topic models for collections of documents, and an introduction to graphical models. It explains how word embeddings derive relationships between words from the contexts in which they appear, and how topic models identify the topics of documents in a collection without requiring a query. The lecture then delves into the skip-gram model of Word2vec, negative sampling, and properties of the learned word vectors. It also covers Bayesian Networks, their parameterization, and inference methods such as Gibbs sampling, and concludes with a discussion of the computational challenges that arise in large models and a summary of the key ideas behind Bayesian Networks.
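
The skip-gram model with negative sampling trains word vectors so that a center word's vector scores high against the vectors of words that appear in its context window and low against randomly drawn "negative" words. The sketch below is a minimal illustration of that idea in plain numpy; the toy corpus, hyperparameters, and initialisation are illustrative assumptions, not the lecture's reference implementation.

```python
# Minimal sketch of skip-gram with negative sampling (SGNS), numpy only.
# Corpus and hyperparameters are hypothetical toy choices for illustration.
import numpy as np

rng = np.random.default_rng(0)

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = np.array([word2id[w] for w in corpus])

V, dim, window, k_neg, lr, epochs = len(vocab), 16, 2, 5, 0.05, 200

# Two embedding tables: W_in for center words, W_out for context words.
W_in = rng.normal(scale=0.1, size=(V, dim))
W_out = rng.normal(scale=0.1, size=(V, dim))

# Unigram distribution raised to the 3/4 power, the usual negative-sampling choice.
counts = np.bincount(ids, minlength=V).astype(float)
neg_probs = counts ** 0.75
neg_probs /= neg_probs.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, center in enumerate(ids):
        # Context word ids within the window around the center word.
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            context = ids[ctx_pos]
            negatives = rng.choice(V, size=k_neg, p=neg_probs)

            v_c = W_in[center]                            # center (input) vector
            out_ids = np.r_[context, negatives]           # 1 positive + k negatives
            u_samples = W_out[out_ids]
            labels = np.zeros(k_neg + 1)
            labels[0] = 1.0                               # the observed pair is the positive

            scores = sigmoid(u_samples @ v_c)
            errors = scores - labels                      # gradient of the logistic loss

            # SGD step on the output vectors and the center vector.
            W_out[out_ids] -= lr * np.outer(errors, v_c)
            W_in[center] -= lr * (errors @ u_samples)

# Cosine similarity between two learned vectors, e.g. "cat" and "dog".
a, b = W_in[word2id["cat"]], W_in[word2id["dog"]]
print(float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b))))
```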
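For inference in Bayesian Networks, Gibbs sampling repeatedly resamples each unobserved variable from its distribution conditioned on the current values of all the others, which can be obtained by renormalising the joint over that variable's states. Below is a minimal sketch of this idea on the classic sprinkler network; the network, CPT values, and query are illustrative assumptions rather than material taken from the lecture.

```python
# Minimal sketch of Gibbs sampling in a small Bayesian network (the classic
# sprinkler network); CPT values here are illustrative textbook-style numbers.
import random

random.seed(0)

def p_c(c):            return 0.5
def p_s_given_c(s, c): return (0.1 if s else 0.9) if c else (0.5 if s else 0.5)
def p_r_given_c(r, c): return (0.8 if r else 0.2) if c else (0.2 if r else 0.8)
def p_w_given_sr(w, s, r):
    pw1 = {(0, 0): 0.0, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}[(s, r)]
    return pw1 if w else 1.0 - pw1

def joint(c, s, r, w):
    # Product of all CPTs: the factorisation encoded by the network structure.
    return p_c(c) * p_s_given_c(s, c) * p_r_given_c(r, c) * p_w_given_sr(w, s, r)

def gibbs_rain_given_wet(n_samples=20000, burn_in=2000):
    # Evidence: WetGrass = 1. Hidden variables: Cloudy, Sprinkler, Rain.
    c, s, r, w = 1, 0, 1, 1
    rain_count, kept = 0, 0
    for it in range(n_samples):
        # Resample each hidden variable from its full conditional given the rest,
        # obtained by renormalising the joint over that variable's two states.
        for var in ("c", "s", "r"):
            state = {"c": c, "s": s, "r": r}
            p = []
            for val in (0, 1):
                state[var] = val
                p.append(joint(state["c"], state["s"], state["r"], w))
            new_val = 1 if random.random() < p[1] / (p[0] + p[1]) else 0
            if var == "c": c = new_val
            elif var == "s": s = new_val
            else: r = new_val
        if it >= burn_in:
            rain_count += r
            kept += 1
    return rain_count / kept

print("P(Rain = 1 | WetGrass = 1) approx", round(gibbs_rain_given_wet(), 3))
```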