Sequence to Sequence Models: Overview and Attention Mechanisms
Explores sequence-to-sequence (Seq2Seq) models with and without attention, covering the encoder-decoder architecture, context vectors, the decoding process, and the main types of attention mechanisms.
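To make the pieces named above concrete, here is a minimal sketch of an encoder-decoder Seq2Seq model with dot-product (Luong-style) attention and greedy decoding, assuming PyTorch. All names (Encoder, AttnDecoder), sizes, and the choice of GRU layers are illustrative assumptions, not taken from the lecture itself.

```python
# Minimal Seq2Seq sketch: GRU encoder, GRU decoder with dot-product attention,
# and a few steps of greedy decoding. Hypothetical names and sizes throughout.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):                                   # src: (batch, src_len)
        outputs, hidden = self.gru(self.embed(src))
        return outputs, hidden                                # outputs: (batch, src_len, hidden)


class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(2 * hidden_size, vocab_size)

    def forward(self, tgt_token, hidden, enc_outputs):
        # tgt_token: (batch, 1) -- one decoding step at a time
        embedded = self.embed(tgt_token)                        # (batch, 1, hidden)
        output, hidden = self.gru(embedded, hidden)             # (batch, 1, hidden)
        # Dot-product attention: score each encoder state against the decoder state.
        scores = torch.bmm(output, enc_outputs.transpose(1, 2))  # (batch, 1, src_len)
        weights = F.softmax(scores, dim=-1)
        # Context vector: attention-weighted sum of encoder states.
        context = torch.bmm(weights, enc_outputs)               # (batch, 1, hidden)
        # Combine decoder state and context to predict the next token.
        logits = self.out(torch.cat([output, context], dim=-1))  # (batch, 1, vocab)
        return logits, hidden, weights


if __name__ == "__main__":
    batch, src_len, hidden = 2, 7, 32
    src_vocab, tgt_vocab = 100, 120
    enc, dec = Encoder(src_vocab, hidden), AttnDecoder(tgt_vocab, hidden)

    src = torch.randint(0, src_vocab, (batch, src_len))
    enc_outputs, hidden_state = enc(src)

    # Greedy decoding for a few steps, starting from an assumed <sos> token id of 1.
    token = torch.ones(batch, 1, dtype=torch.long)
    for _ in range(5):
        logits, hidden_state, attn = dec(token, hidden_state, enc_outputs)
        token = logits.argmax(dim=-1)                           # next input token, (batch, 1)
    print(token.shape, attn.shape)  # torch.Size([2, 1]) torch.Size([2, 1, 7])
```

In a plain Seq2Seq model without attention, the decoder would instead condition only on the encoder's final hidden state as a fixed context vector; the attention step above replaces that fixed vector with a per-step weighted sum over all encoder outputs.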