Lecture

Modern NLP: Classical Language Models

Description

This lecture introduces language models, covering count-based n-gram models and fixed-context neural models. It presents language models as probabilistic models of token sequences and shows how the chain rule decomposes the joint probability of a sequence into conditional probabilities. It then covers evaluating language models, smoothing methods such as Laplace smoothing and absolute discounting, the challenges n-gram models face, and the impact of language models on downstream applications. Finally, it explores fixed-context neural language models and their advantages and limitations compared to traditional n-gram models.
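The n-gram estimation, Laplace smoothing, and chain-rule decomposition mentioned above can be sketched as a minimal bigram model in Python. This is an illustrative sketch, not course material: the function names, boundary tokens, and toy corpus are all assumptions made for the example.

```python
from collections import Counter

def train_bigram(corpus):
    """Count bigrams and context unigrams from tokenized sentences."""
    bigrams, contexts, vocab = Counter(), Counter(), set()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]  # assumed boundary markers
        vocab.update(toks)
        for prev, cur in zip(toks, toks[1:]):
            bigrams[(prev, cur)] += 1
            contexts[prev] += 1
    return bigrams, contexts, vocab

def laplace_prob(w, prev, bigrams, contexts, vocab):
    """P(w | prev) with add-one (Laplace) smoothing over vocabulary size V."""
    return (bigrams[(prev, w)] + 1) / (contexts[prev] + len(vocab))

def sentence_prob(sent, bigrams, contexts, vocab):
    """Chain rule: P(w_1..w_n) = product over i of P(w_i | w_{i-1})."""
    toks = ["<s>"] + sent + ["</s>"]
    p = 1.0
    for prev, cur in zip(toks, toks[1:]):
        p *= laplace_prob(cur, prev, bigrams, contexts, vocab)
    return p

# Toy corpus for illustration only.
bigrams, contexts, vocab = train_bigram([["the", "cat"], ["the", "dog"]])
```

Note that Laplace smoothing assigns non-zero probability to unseen bigrams, so `sentence_prob` never returns zero; heavier discounting schemes like absolute discounting redistribute probability mass less aggressively.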
