Transformer: Pre-Training
Related lectures (31)
Pretraining: Transformers & Models
Explores pretraining models like BERT, T5, and GPT, discussing their training objectives and applications in natural language processing.
Neural Machine Translation
Explores the evolution and challenges of Neural Machine Translation systems and the evaluation metrics used in this field.
Machine Translation: Sequence-to-Sequence and Attention
Explores the advancements in Machine Translation, focusing on Sequence-to-Sequence models and Attention mechanisms.
Deep Learning for NLP
Explores deep learning for NLP, covering word embeddings, context representations, learning techniques, and challenges such as vanishing gradients, along with ethical considerations.
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
Contextual Representations: ELMO and BERT Overview
Covers contextual representations in NLP, focusing on ELMO and BERT architectures and their applications in various tasks.
Transformers: Overview and Self-Attention
Provides an overview of Transformers, self-attention, multi-headed attention, and the Transformer encoder and decoder (a minimal self-attention sketch follows this list).
BERT: Pretraining and Applications
Delves into BERT pretraining for Transformers, discussing its applications in downstream NLP tasks.
Transformers in Vision: Applications and Architectures
Covers the impact of transformers in computer vision, discussing their architecture, applications, and advancements in various tasks.
Modern NLP and Ethics in NLP
Delves into advancements and challenges in NLP, along with ethical considerations and potential harms.
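
Several of the lectures above, notably "Transformers: Overview and Self-Attention", center on scaled dot-product self-attention, the core operation of the Transformer. As a rough companion to those summaries, here is a minimal NumPy sketch of a single attention head; the function name, weight matrices, and dimensions are illustrative assumptions, not taken from any of the lecture materials.

```python
import numpy as np

def scaled_dot_product_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a sequence of token vectors X.

    X has shape (seq_len, d_model); Wq/Wk/Wv are illustrative
    projection weights (not from the lectures).
    """
    Q = X @ Wq                                   # queries, (seq_len, d_k)
    K = X @ Wk                                   # keys,    (seq_len, d_k)
    V = X @ Wv                                   # values,  (seq_len, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # pairwise similarities, (seq_len, seq_len)
    # Numerically stable row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                           # each output mixes all value vectors

# Toy usage: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

In a full Transformer, several such heads run in parallel (multi-headed attention) and their outputs are concatenated and projected, but the per-head computation is exactly this weighted mixing of value vectors.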