Lecture

Modern NLP: Classical Language Models

Description

This lecture introduces language models, covering both count-based models and fixed-context neural models. It frames language models as probabilistic models of token sequences, explains the chain rule for computing joint probabilities, and discusses how language models are evaluated and how they affect downstream applications. The lecture then examines n-gram models, smoothing methods such as Laplace smoothing and absolute discounting, and the data-sparsity challenges n-gram models face. Finally, it explores fixed-context neural language models and their advantages and limitations relative to traditional n-gram models.
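For concreteness, the chain-rule factorization mentioned above can be written as follows (notation ours, not taken from the lecture page):

P(w_1, ..., w_T) = P(w_1) * P(w_2 | w_1) * ... * P(w_T | w_1, ..., w_{T-1})

An n-gram model approximates each factor by truncating the history to the previous n-1 tokens, e.g. P(w_t | w_1, ..., w_{t-1}) ≈ P(w_t | w_{t-n+1}, ..., w_{t-1}).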
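The following is a minimal, illustrative Python sketch of the count-based approach described above: a bigram model with add-one (Laplace) smoothing, evaluated by perplexity. The toy corpus and all names are our own assumptions, not from the lecture.

import math
from collections import Counter

# Toy corpus; <s> and </s> mark sentence boundaries.
corpus = [
    ["<s>", "the", "cat", "sat", "</s>"],
    ["<s>", "the", "dog", "sat", "</s>"],
    ["<s>", "the", "cat", "ran", "</s>"],
]

vocab = {w for sent in corpus for w in sent}
V = len(vocab)

unigram_counts = Counter()
bigram_counts = Counter()
for sent in corpus:
    for prev, cur in zip(sent, sent[1:]):
        unigram_counts[prev] += 1
        bigram_counts[(prev, cur)] += 1

def laplace_prob(prev, cur, alpha=1.0):
    # Add-alpha smoothing; alpha=1 gives Laplace smoothing.
    return (bigram_counts[(prev, cur)] + alpha) / (unigram_counts[prev] + alpha * V)

def perplexity(sent):
    # Per-token perplexity of a sentence under the smoothed bigram model.
    log_prob = sum(math.log(laplace_prob(p, c)) for p, c in zip(sent, sent[1:]))
    return math.exp(-log_prob / (len(sent) - 1))

print(perplexity(["<s>", "the", "dog", "ran", "</s>"]))

Smoothing matters here because the bigram ("dog", "ran") never occurs in the corpus; without it, the model would assign the test sentence probability zero and its perplexity would be infinite.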
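A fixed-context neural language model replaces the count table with learned embeddings and a feedforward network over a fixed window of previous tokens, in the style of Bengio et al. (2003). Below is a hypothetical PyTorch sketch; the class name, dimensions, and vocabulary size are illustrative assumptions, not the lecture's implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedContextLM(nn.Module):
    # Embed the previous context_size tokens, concatenate the embeddings,
    # and predict a distribution over the next token.
    def __init__(self, vocab_size, context_size=3, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context):                    # context: (batch, context_size)
        e = self.embed(context)                    # (batch, context_size, embed_dim)
        h = torch.tanh(self.hidden(e.flatten(1)))  # concatenated embeddings -> hidden
        return self.out(h)                         # logits over the vocabulary

# Smoke test with random token ids (hypothetical vocabulary of 100 types).
model = FixedContextLM(vocab_size=100)
context = torch.randint(0, 100, (4, 3))            # batch of 4 three-token contexts
loss = F.cross_entropy(model(context), torch.randint(0, 100, (4,)))
loss.backward()

Unlike an n-gram table, the embeddings let the model share statistics between similar words, an advantage over pure counting, while the fixed context window remains a limitation.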
