Natural Language Processing: Understanding Transformers and Tokenization
Related lectures (32)
Deep Learning for NLP
Introduces deep learning concepts for NLP, covering word embeddings, RNNs, and Transformers, with emphasis on self-attention and multi-head attention.
Sequence to Sequence Models: Overview and Applications
Covers sequence-to-sequence models, their architecture, applications, and the role of attention mechanisms in improving performance.
Neural Networks for NLP
Covers modern neural network approaches to NLP, focusing on word embeddings, neural architectures for NLP tasks, and emerging transfer learning techniques.
Transformers: Pretraining and Decoding Techniques
Covers advanced transformer concepts, focusing on pretraining and decoding techniques in NLP.
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Deep Learning for NLP
Delves into deep learning for natural language processing, exploring neural word embeddings, recurrent neural networks, and attentive neural modeling with Transformers.
Transformers: Revolutionizing Attention Mechanisms in NLP
Covers the development of transformers and their impact on attention mechanisms in NLP.
Introduction to Modern Natural Language Processing
Introduces the course on Modern Natural Language Processing, covering its significance, applications, challenges, and recent technological advances.
Binary Sentiment Classifier Training
Covers the training of a binary sentiment classifier using an RNN.
Neural Word Embeddings: Learning Representations for Natural Language
Covers neural word embeddings and methods for learning word representations in natural language processing.