Generative pre-trained transformer
Related lectures (31)
Pretraining: Transformers & Models
Explores pretraining models like BERT, T5, and GPT, discussing their training objectives and applications in natural language processing.
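As a rough illustration of the pretraining setups mentioned above, a minimal sketch assuming the Hugging Face transformers library (model names are illustrative, not material from the lecture):

    # Load off-the-shelf checkpoints for two of the pretraining objectives above.
    from transformers import AutoTokenizer, AutoModelForMaskedLM, AutoModelForCausalLM

    # BERT: masked language modelling (predict randomly masked tokens).
    bert_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    bert = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    # GPT-2: causal (left-to-right) language modelling (predict the next token).
    gpt_tok = AutoTokenizer.from_pretrained("gpt2")
    gpt = AutoModelForCausalLM.from_pretrained("gpt2")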
Custom GPTs for Academic Writing
Delves into the use of custom GPTs for academic writing, highlighting the balance between AI assistance and traditional learning methods.
Transformers: Pre-Training
Discusses challenges and advancements in Transformers, pretraining models, and subword tokenization in NLP.
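To make the subword tokenization point concrete, a minimal sketch assuming the Hugging Face transformers library (the checkpoint is illustrative):

    from transformers import AutoTokenizer

    # GPT-2 uses byte-pair encoding: rare or unseen words are split into known
    # subword pieces instead of being mapped to an unknown-token symbol.
    tok = AutoTokenizer.from_pretrained("gpt2")
    print(tok.tokenize("untranslatability"))  # prints several subword pieces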
Transformers: Pretraining and Decoding Techniques
Covers advanced transformer concepts, focusing on pretraining and decoding techniques in NLP.
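A minimal sketch of two common decoding strategies (greedy search vs. top-k / nucleus sampling), assuming the Hugging Face transformers generation API; the prompt and parameters are illustrative:

    from transformers import AutoTokenizer, AutoModelForCausalLM

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    inputs = tok("Pretrained transformers can", return_tensors="pt")

    # Greedy decoding: always pick the most probable next token.
    greedy = model.generate(**inputs, max_new_tokens=20, do_sample=False)

    # Sampling: draw from a truncated next-token distribution.
    sampled = model.generate(**inputs, max_new_tokens=20, do_sample=True,
                             top_k=50, top_p=0.95, temperature=0.8)

    print(tok.decode(greedy[0], skip_special_tokens=True))
    print(tok.decode(sampled[0], skip_special_tokens=True))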
Generative Models: Self-Attention and Transformers
Covers generative models with a focus on self-attention and transformers, discussing sampling methods and empirical means.
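The self-attention operation itself fits in a few lines; a minimal sketch in PyTorch (shapes and names are illustrative):

    import torch
    import torch.nn.functional as F

    def self_attention(x, w_q, w_k, w_v):
        # x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_k) projection matrices.
        q, k, v = x @ w_q, x @ w_k, x @ w_v
        scores = q @ k.T / (k.shape[-1] ** 0.5)   # scaled pairwise similarities
        weights = F.softmax(scores, dim=-1)       # one attention distribution per position
        return weights @ v                        # weighted sum of value vectors

    x = torch.randn(5, 16)                        # 5 tokens, model dimension 16
    w_q, w_k, w_v = (torch.randn(16, 8) for _ in range(3))
    out = self_attention(x, w_q, w_k, w_v)        # shape (5, 8)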
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
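A minimal sketch of the T5-style denoising (span-corruption) objective, assuming the Hugging Face transformers library; the corrupted sentence follows the library's documented sentinel-token convention:

    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tok = AutoTokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Corrupted spans are replaced by sentinel tokens in the input; the target
    # reconstructs the missing spans, each prefixed by its sentinel.
    inputs = tok("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt")
    labels = tok("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>",
                 return_tensors="pt").input_ids

    loss = model(input_ids=inputs.input_ids, labels=labels).loss  # denoising loss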
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
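A minimal sketch of a bidirectional LSTM encoder in PyTorch (sizes are illustrative, not the lecture's configuration):

    import torch
    import torch.nn as nn

    embed = nn.Embedding(num_embeddings=10_000, embedding_dim=128)
    bilstm = nn.LSTM(input_size=128, hidden_size=256,
                     batch_first=True, bidirectional=True)

    tokens = torch.randint(0, 10_000, (2, 12))   # (batch, seq_len) of token ids
    outputs, _ = bilstm(embed(tokens))           # (2, 12, 512): forward and backward states concatenated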
Contextual Representations: ELMo & BERT
Explores the development of contextualized embeddings in NLP, focusing on ELMo and BERT's advances and their impact on downstream tasks.
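A minimal sketch of what "contextual" means in practice, assuming the Hugging Face transformers library: the same word receives different vectors in different sentences.

    import torch
    from transformers import AutoTokenizer, AutoModel

    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    a = tok("She sat on the river bank", return_tensors="pt")
    b = tok("He deposited cash at the bank", return_tensors="pt")
    with torch.no_grad():
        emb_a = model(**a).last_hidden_state   # (1, seq_len, 768) per-token vectors
        emb_b = model(**b).last_hidden_state   # the vector for "bank" differs between the two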
Coreference Resolution: Models and Evaluation
Explores coreference resolution models, challenges in scoring spans, graph refinement techniques, state-of-the-art results, and the impact of pretrained Transformers.
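A minimal sketch of the span-pair scoring idea common in neural coreference models (a mention score for each span plus a pairwise antecedent score); the networks and dimensions are illustrative, not the lecture's exact model:

    import torch
    import torch.nn as nn

    d = 64
    mention_scorer = nn.Linear(d, 1)        # s_m(i): is span i a mention?
    pair_scorer = nn.Linear(3 * d, 1)       # s_a(i, j): are spans i and j coreferent?

    def coref_score(g_i, g_j):
        # Total score s(i, j) = s_m(i) + s_m(j) + s_a(i, j) for span embeddings g_i, g_j.
        pair = torch.cat([g_i, g_j, g_i * g_j], dim=-1)
        return mention_scorer(g_i) + mention_scorer(g_j) + pair_scorer(pair)

    score = coref_score(torch.randn(d), torch.randn(d))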
Contextual Representations: ELMo and BERT Overview
Covers contextual representations in NLP, focusing on ELMo and BERT architectures and their applications in various tasks.