This lecture introduces language models, covering both count-based n-gram models and fixed-context neural models. It defines a language model as a probabilistic model over token sequences and shows how the chain rule decomposes the joint probability of a sequence into next-token conditionals (written out below). Building on this, it presents n-gram models and their estimation from counts, smoothing techniques such as Laplace smoothing and absolute discounting (sketched in code after the equations), and the challenges n-gram models face. It also discusses how language models are evaluated and how they affect downstream applications. Finally, it covers fixed-context neural language models and weighs their advantages and limitations against traditional n-gram models (see the sketch after the smoothing example).
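As a concrete reference for the chain-rule factorization mentioned above, here is the standard formulation (the notation $w_1, \dots, w_T$ is ours, not necessarily the lecture's):

```latex
% Chain rule: a language model factorizes the joint probability
% of a token sequence into a product of next-token conditionals.
P(w_1, \dots, w_T) = \prod_{t=1}^{T} P(w_t \mid w_1, \dots, w_{t-1})

% An n-gram model approximates each conditional by truncating
% the history to the previous n-1 tokens:
P(w_t \mid w_1, \dots, w_{t-1}) \approx P(w_t \mid w_{t-n+1}, \dots, w_{t-1})
```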
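A minimal sketch of the two smoothing methods named in the summary, applied to a count-based bigram model. The toy corpus and helper names are ours, and the absolute-discounting variant shown here is the common interpolated form (discounted bigram estimate plus unigram fallback); the lecture may present a different variant.

```python
from collections import Counter

# Toy corpus; any tokenized text works here.
corpus = "the cat sat on the mat the cat ate".split()

# Unigram and bigram counts form the count-based model.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def p_laplace(prev, word):
    """Laplace (add-one) smoothing: pretend every bigram was seen once more."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + V)

def p_absolute_discount(prev, word, d=0.75):
    """Absolute discounting: subtract d from every seen bigram count and
    redistribute the freed probability mass over the unigram distribution."""
    seen_types = sum(1 for b in bigrams if b[0] == prev)
    backoff = unigrams[word] / len(corpus)          # unigram fallback
    discounted = max(bigrams[(prev, word)] - d, 0) / unigrams[prev]
    lam = d * seen_types / unigrams[prev]           # leftover mass
    return discounted + lam * backoff

print(p_laplace("the", "cat"))            # 3/9 ~ 0.333
print(p_absolute_discount("the", "cat"))  # ~0.528
```

Both functions assign nonzero probability to bigrams never seen in training, which is the point of smoothing: an unsmoothed count-based model would give them probability zero.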
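For the fixed-context neural model, a sketch in the style of Bengio et al.'s classic architecture: embed the previous n-1 tokens, concatenate the embeddings, and predict the next token with a small MLP. All dimensions and names here are illustrative assumptions, not the lecture's exact setup.

```python
import torch
import torch.nn as nn

class FixedContextLM(nn.Module):
    """Fixed-context neural LM: embed the previous context_size tokens,
    concatenate the embeddings, and score the next token with an MLP."""
    def __init__(self, vocab_size, context_size=3, emb_dim=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.mlp = nn.Sequential(
            nn.Linear(context_size * emb_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, vocab_size),
        )

    def forward(self, context):            # context: (batch, context_size)
        e = self.embed(context)            # (batch, context_size, emb_dim)
        e = e.flatten(start_dim=1)         # concatenate the embeddings
        return self.mlp(e)                 # logits over the vocabulary

model = FixedContextLM(vocab_size=100)
logits = model(torch.randint(0, 100, (4, 3)))  # 4 contexts of 3 tokens
probs = logits.softmax(dim=-1)                 # next-token distributions
print(probs.shape)                             # torch.Size([4, 100])
```

Because parameters are shared through the embeddings, such a model generalizes to contexts never seen verbatim in training, which addresses the sparsity that smoothing only patches over in n-gram models; its fixed context window remains the key limitation relative to models that condition on the full history.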