This lecture covers topic models, focusing on Latent Dirichlet Allocation (LDA). It starts with an introduction to clustering tasks and the K-means algorithm, then moves on to density estimation and Gaussian mixture models (GMMs). The lecture explains how GMMs are learned and the limitations of non-parametric density estimation. It then introduces the Dirichlet distribution and its role in LDA, covering the generative process, learning, and approximate inference methods. The lecture concludes with the evaluation of LDA models, applications in the digital humanities, and extensions such as correlated topic models.
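As a concrete reference for the generative process mentioned above, the following is a minimal Python/NumPy sketch of how LDA generates a corpus. The parameter values (vocabulary size V, topic count K, document count D, document length N, and the Dirichlet priors alpha and beta) are illustrative placeholders, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (assumed, not from the lecture).
V, K, D, N = 1000, 5, 3, 50
alpha = np.full(K, 0.1)    # document-topic Dirichlet prior
beta = np.full(V, 0.01)    # topic-word Dirichlet prior

# Each topic is a categorical distribution over the vocabulary.
phi = rng.dirichlet(beta, size=K)                 # shape (K, V)

documents = []
for _ in range(D):
    theta = rng.dirichlet(alpha)                  # topic proportions for this document
    z = rng.choice(K, size=N, p=theta)            # a topic assignment per word position
    words = [rng.choice(V, p=phi[k]) for k in z]  # a word drawn from the assigned topic
    documents.append(words)
```

In LDA, learning inverts this process: given only the observed words, approximate inference (e.g., variational methods or Gibbs sampling, as discussed in the lecture) recovers the latent topic-word distributions and per-document topic proportions.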