Lecture

Machine Translation: Sequence-to-Sequence and Attention

Related lectures (54)
Deep Learning for NLP
Introduces deep learning concepts for NLP, covering word embeddings, RNNs, and Transformers, emphasizing self-attention and multi-headed attention.
Language Models: From Theory to Computation
Explores the mathematics of language models, covering architecture design, pre-training, and fine-tuning, and emphasizes how these stages adapt models to various tasks.
Deep Learning for NLP
Delves into Deep Learning for Natural Language Processing, exploring Neural Word Embeddings, Recurrent Neural Networks, and Attentive Neural Modeling with Transformers.
Vision-Language-Action Models: Training and Applications
Delves into the training and applications of Vision-Language-Action models, emphasizing large language models' role in robotic control and the transfer of web knowledge, and highlights experimental results and future research directions.
Neural Machine Translation
Explores the evolution and challenges of Neural Machine Translation systems and the evaluation metrics used in this field.
Text Generation: Basics and Evaluation
Covers the basics of text generation and the challenges of evaluating generated text using content overlap metrics, model-based metrics, and human evaluations.
Sequence to Sequence Models: Overview and Attention Mechanisms
Explores sequence to sequence models, attention mechanisms, and their role in addressing model limitations and improving interpretability.
Machine Translation: Attention Mechanism
Explores the attention mechanism in machine translation, addressing the bottleneck problem and improving NMT performance significantly.
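The attention mechanism referenced above can be illustrated with a minimal sketch of scaled dot-product attention, where a decoder query scores each encoder state and forms a weighted sum of their values, sidestepping the fixed-vector bottleneck of plain sequence-to-sequence models. The function name and toy data below are illustrative, not from any specific lecture:

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Compute scaled dot-product attention.

    queries: (n_q, d), keys: (n_k, d), values: (n_k, d_v).
    Returns (n_q, d_v) context vectors and (n_q, n_k) attention weights.
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)          # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)    # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over source positions
    context = weights @ values                      # weighted sum of value vectors
    return context, weights

# Toy example: one decoder query attending over three encoder states.
q = np.array([[1.0, 0.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
V = np.array([[1.0], [2.0], [3.0]])
ctx, w = scaled_dot_product_attention(q, K, V)
```

Because the query aligns best with the first key, the first weight dominates and the context vector leans toward the first value; the weights always sum to 1 across source positions.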
Trajectory Forecasting in Autonomous Vehicles
Explores trajectory forecasting in autonomous vehicles, focusing on deep learning models for predicting human trajectories in socially-aware transportation scenarios.
Language Modelling and Recurrent Neural Networks
Explores language modelling, RNNs, n-gram models, LSTMs, and bidirectional RNNs.