Lecture

Neural Network Syntactic Parsing

Related lectures (37)
Syntactic Parsing and Graph2Graph Transformers
Covers syntactic parsing, dependency grammar, and graph-based neural network parsing.
Syntactic Parsing: Dependency Structure
Covers syntactic structure, dependency parsing, and neural network transition-based parsing, highlighting the importance of dependency structure in linguistic analysis.
Vision-Language-Action Models: Training and Applications
Covers the training and applications of Vision-Language-Action models, emphasizing the role of large language models in robotic control and the transfer of web knowledge, and highlights experimental results and future research directions.
Long Short-Term Memory Networks
Introduces Long Short-Term Memory (LSTM) networks as a solution to vanishing and exploding gradients in recurrent neural networks.
Neural Networks for NLP
Covers modern neural-network approaches to NLP, focusing on word embeddings, neural networks for NLP tasks, and emerging transfer-learning techniques.
Deep Learning for Autonomous Vehicles: Learning
Explores learning in deep learning for autonomous vehicles, covering predictive models, RNN, ImageNet, and transfer learning.
Language Modelling and Recurrent Neural Networks
Explores language modelling, RNNs, n-gram models, LSTMs, and bidirectional RNNs.
Deep Learning for Question Answering
Explores deep learning for question answering, analyzing neural networks and model robustness to noise.
Seq2Seq Models: Attention vs. No Attention
Explores Seq2Seq models with and without attention mechanisms, covering encoder-decoder architecture, context vectors, decoding processes, and different types of attention mechanisms.
Language Models: From Theory to Computation
Explores the mathematics of language models, covering architecture design, pre-training, and fine-tuning, and emphasizing the importance of the latter two for a wide range of tasks.
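Several of the lectures above concern transition-based dependency parsing, where a classifier (in neural parsers, a neural network over the configuration) repeatedly chooses SHIFT, LEFT-ARC, or RIGHT-ARC actions. The sketch below illustrates the arc-standard transition system with a pluggable action-scoring function; all names are illustrative and not taken from any specific lecture's code.

```python
def parse(words, score):
    """Greedy arc-standard dependency parser (illustrative sketch).

    words: list of tokens; index 0 is an implicit ROOT.
    score: callable(stack, buffer) -> "SHIFT" | "LEFT-ARC" | "RIGHT-ARC".
           In a neural parser this would be a classifier over features
           of the current configuration.
    """
    stack = [0]                           # ROOT starts on the stack
    buffer = list(range(1, len(words)))   # remaining token indices
    arcs = []                             # (head, dependent) pairs

    while buffer or len(stack) > 1:
        action = score(stack, buffer)
        if action == "LEFT-ARC" and len(stack) > 2:
            dep = stack.pop(-2)           # second-from-top becomes dependent of top
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC" and len(stack) > 1:
            dep = stack.pop()             # top becomes dependent of new top
            arcs.append((stack[-1], dep))
        elif buffer:                      # SHIFT (also the fallback)
            stack.append(buffer.pop(0))
        else:
            break                         # no valid action remains
    return arcs


# Usage with a scripted action sequence standing in for the classifier:
actions = iter(["SHIFT", "SHIFT", "LEFT-ARC", "RIGHT-ARC"])
arcs = parse(["ROOT", "She", "eats"], lambda s, b: next(actions))
# arcs: [(2, 1), (0, 2)] — "She" depends on "eats", which attaches to ROOT
```

A trained neural parser replaces the scripted oracle with a model that scores actions from embeddings of the stack and buffer contents.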