Question Answering: Deep Learning Insights
Related lectures (32)
Transformers: Revolutionizing Attention Mechanisms in NLP
Covers the development of transformers and their impact on attention mechanisms in NLP.
Word Embeddings: Introduction and Applications
Introduces word embeddings, explaining how they capture word meaning from context, and surveys their applications in natural language processing tasks.
Transformers in Vision: Applications and Architectures
Covers the impact of transformers in computer vision, discussing their architecture, applications, and advancements in various tasks.
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Binary Sentiment Classifier Training
Covers the training of a binary sentiment classifier using an RNN.
Coreference Resolution
Delves into coreference resolution, discussing challenges, advancements, and evaluation methods.
Classical Language Models: Foundations and Applications
Introduces classical language models, their applications, and foundational concepts like count-based modeling and evaluation metrics.
Scaling Language Models: Efficiency and Deployment
Covers the scaling of language models, focusing on training efficiency and deployment considerations.
Transformer Architecture: Subquadratic Attention Mechanisms
Covers the transformer architecture, focusing on encoder-decoder models and subquadratic attention mechanisms for efficient processing of input sequences.
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.