Lecture

Text Models: Word Embeddings and Topic Models

Description

This lecture covers probabilistic models for text: word embeddings that capture word semantics, topic models for document collections, and an introduction to graphical models. It explains how word embeddings capture relationships among words from the contexts in which they occur, and how topic models uncover the themes of a document collection without requiring a query. The lecture then examines the skip-gram model of Word2vec, negative sampling, and the properties of the resulting word vectors. It also introduces Bayesian Networks, their parameterization, and inference methods such as Gibbs sampling, and concludes with a discussion of the computational challenges of inference in large models and the key ideas behind Bayesian Networks.
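
As a concrete illustration of the skip-gram model with negative sampling, the following minimal sketch trains embeddings on a toy corpus. It is not code from the lecture: the corpus, window size, embedding dimension, and learning rate are arbitrary assumptions, and negatives are drawn uniformly rather than from the unigram^0.75 distribution that Word2vec uses in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (illustrative assumption, not from the lecture).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16               # vocabulary size, embedding dimension

W_in = rng.normal(0, 0.1, (V, D))   # "input" (center-word) vectors
W_out = rng.normal(0, 0.1, (V, D))  # "output" (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3          # learning rate, context window, negatives per pair
for epoch in range(200):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            if off == 0 or not (0 <= pos + off < len(corpus)):
                continue
            o = idx[corpus[pos + off]]  # observed context word (positive example)
            negs = rng.integers(0, V, size=k)  # uniform negatives (simplification)
            # Gradient ascent on log sigma(u_o.v_c) + sum_n log sigma(-u_n.v_c)
            g_pos = 1.0 - sigmoid(W_out[o] @ W_in[c])
            grad_c = g_pos * W_out[o]
            W_out[o] += lr * g_pos * W_in[c]
            for n in negs:
                g_neg = -sigmoid(W_out[n] @ W_in[c])
                grad_c += g_neg * W_out[n]
                W_out[n] += lr * g_neg * W_in[c]
            W_in[c] += lr * grad_c

# Nearest neighbours of "cat" by cosine similarity of the learned vectors.
v = W_in[idx["cat"]]
sims = (W_in @ v) / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(v) + 1e-9)
print([vocab[i] for i in np.argsort(-sims)[:3]])
```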
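
Gibbs sampling over a Bayesian Network can likewise be sketched on the textbook sprinkler network (Cloudy → Sprinkler, Cloudy → Rain, and Sprinkler, Rain → WetGrass). The conditional probability tables below are standard textbook values, assumed here for illustration; each step resamples one hidden variable from its distribution given its Markov blanket, with the evidence WetGrass = true held fixed.

```python
import random

random.seed(0)

# Sprinkler network CPTs (standard textbook values, assumed for illustration).
P_C = 0.5
P_S = {True: 0.1, False: 0.5}                    # P(S=1 | C)
P_R = {True: 0.8, False: 0.2}                    # P(R=1 | C)
P_W = {(True, True): 0.99, (True, False): 0.9,
       (False, True): 0.9, (False, False): 0.0}  # P(W=1 | S, R)

def bern(p):
    return random.random() < p

# Estimate P(Rain | WetGrass = true) by Gibbs sampling the hidden variables.
c, s, r = True, False, True     # arbitrary initial state; WetGrass clamped to true
count, N, burn = 0, 20000, 1000
for t in range(N + burn):
    # Resample Cloudy given its Markov blanket (its children S and R).
    p1 = P_C * (P_S[True] if s else 1 - P_S[True]) * (P_R[True] if r else 1 - P_R[True])
    p0 = (1 - P_C) * (P_S[False] if s else 1 - P_S[False]) * (P_R[False] if r else 1 - P_R[False])
    c = bern(p1 / (p1 + p0))
    # Resample Sprinkler given its parent C and child W (W = true).
    p1 = P_S[c] * P_W[(True, r)]
    p0 = (1 - P_S[c]) * P_W[(False, r)]
    s = bern(p1 / (p1 + p0))
    # Resample Rain given its parent C and child W.
    p1 = P_R[c] * P_W[(s, True)]
    p0 = (1 - P_R[c]) * P_W[(s, False)]
    r = bern(p1 / (p1 + p0))
    if t >= burn:
        count += r
print("P(Rain | WetGrass) ~", count / N)   # roughly 0.7 for these CPTs
```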
