BERT: Pretraining and Applications
Related lectures (31)
Deep Learning for Question Answering
Explores deep learning for question answering, analyzing neural networks and model robustness to noise.
Language Models: Fixed-context and Recurrent Neural Networks
Discusses language models, focusing on fixed-context neural models and recurrent neural networks.
Ethics in NLP
Discusses the ethical implications of NLP systems, focusing on biases, toxicity, and privacy concerns in language models.
Coreference Resolution
Delves into coreference resolution, discussing challenges, advancements, and evaluation methods.
Model Compression: Techniques for Efficient NLP Models
Explores model compression techniques in NLP, discussing pruning, quantization, weight factorization, knowledge distillation, and attention mechanisms.
Vision-Language-Action Models: Training and Applications
Delves into the training and applications of Vision-Language-Action models, emphasizing the role of large language models in robotic control and the transfer of web knowledge; highlights experimental results and future research directions.
Modern NLP: From GPT to ChatGPT
Explores the evolution of modern NLP from GPT-2 to GPT-3, emphasizing in-context learning and the development of ChatGPT.
Question Answering: Deep Learning Insights
Explores question answering systems, reading comprehension models, and the challenges in achieving accurate responses.
Contextual Representations: ELMo & BERT
Explores the development of contextualized embeddings in NLP, focusing on ELMo and BERT's advancements and impact on NLP tasks.
Transformers: Pretraining
Discusses challenges and advancements in Transformers, pretrained models, and subword tokenization in NLP.