Natural Language Generation: Understanding the Task and Techniques
Related lectures (32)
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Deep Learning for NLP
Introduces deep learning concepts for NLP, covering word embeddings, RNNs, and Transformers, emphasizing self-attention and multi-headed attention.
Natural Language Processing: Understanding Transformers and Tokenization
Provides an overview of Natural Language Processing, focusing on transformers, tokenization, and self-attention mechanisms for effective language analysis and synthesis.
Machine Translation: Attention Mechanism
Explores the attention mechanism in machine translation, addressing the bottleneck problem and improving NMT performance significantly.
Introduction to Modern Natural Language Processing
Introduces the course on Modern Natural Language Processing, covering its significance, applications, challenges, and advancements in technology.
Modern NLP: Introduction
Antoine Bosselut introduces Natural Language Processing and its challenges, advances in neural models, and the goals of the course.
Sequence to Sequence Models: Overview and Applications
Covers sequence-to-sequence models, their architecture, their applications, and the role of attention mechanisms in improving performance.
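Several of these lectures center on the attention mechanism. As a minimal illustrative sketch (not taken from any of the lectures listed here), the scaled dot-product attention at the heart of these models scores each key against a query, normalizes the scores with a softmax, and returns the weighted sum of the values:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    Each key is scored against the query (dot product, scaled by
    sqrt of the dimension), the scores are softmax-normalized, and
    the value vectors are averaged with those weights.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Tiny example: the query aligns with the first key, so the output
# leans toward the first value vector.
out = attention([1.0, 0.0],
                [[1.0, 0.0], [0.0, 1.0]],
                [[10.0, 0.0], [0.0, 10.0]])
```

In a full encoder-decoder model this computation runs for every decoder position against all encoder states, which is what lets the decoder bypass the fixed-size bottleneck of a single context vector.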
Multilingual NLP: Challenges and Innovations
Covers the importance of multilingual NLP and the challenges in scaling language models.
Classical Language Models: Foundations and Applications
Introduces classical language models, their applications, and foundational concepts like count-based modeling and evaluation metrics.
Transformers: Pretraining and Decoding Techniques
Covers advanced transformer concepts, focusing on pretraining and decoding techniques in NLP.