Lecture: Neural Word Embeddings
Related lectures (30)
Natural Language Processing: A Primer
Introduces Natural Language Processing (NLP), covering tokenization, machine learning, sentiment analysis, and NLP applications in Switzerland.
Document Retrieval and Classification
Covers document retrieval, classification, sentiment analysis, and topic detection using TF-IDF matrices and contextualized word vectors like BERT.
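The TF-IDF retrieval mentioned in this entry fits in a few lines; below is a minimal sketch using scikit-learn (the library choice, corpus, and query are illustrative assumptions, not part of the lecture):

```python
# Minimal TF-IDF document retrieval sketch (library and data are
# illustrative assumptions; the lecture is not tied to scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "neural word embeddings capture semantics",
    "tf-idf weights terms by rarity across documents",
    "sentiment analysis classifies opinion in text",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)  # documents x terms

# Retrieve the document most similar to a query.
query_vec = vectorizer.transform(["weighting rare terms in documents"])
scores = cosine_similarity(query_vec, doc_matrix).ravel()
print(docs[scores.argmax()])
```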
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
Text Handling: Matrix, Documents, Topics
Explores text handling, focusing on matrices, documents, and topics, including challenges in document classification and advanced models like BERT.
Language Models: Fixed-context and Recurrent Neural Networks
Discusses language models, focusing on fixed-context neural models and recurrent neural networks.
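A fixed-context neural language model of the kind this entry refers to embeds the previous N tokens, concatenates them, and predicts the next token. The sketch below is a toy illustration in PyTorch; the vocabulary size, context length, and dimensions are arbitrary assumptions:

```python
# Sketch of a fixed-context neural language model: embed the previous
# CONTEXT tokens, concatenate, and predict a distribution over the next
# token. All sizes are arbitrary toy choices.
import torch
import torch.nn as nn

VOCAB, CONTEXT, EMB, HIDDEN = 1000, 3, 64, 128

class FixedContextLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.ff = nn.Sequential(
            nn.Linear(CONTEXT * EMB, HIDDEN),
            nn.Tanh(),
            nn.Linear(HIDDEN, VOCAB),  # logits over next token
        )

    def forward(self, context_ids):             # (batch, CONTEXT)
        e = self.embed(context_ids)             # (batch, CONTEXT, EMB)
        return self.ff(e.flatten(start_dim=1))  # (batch, VOCAB)

model = FixedContextLM()
logits = model(torch.randint(0, VOCAB, (4, CONTEXT)))
print(logits.shape)  # torch.Size([4, 1000])
```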
Word Embeddings: GloVe and Semantic Relationships
Explores word embeddings, the GloVe model, subword embeddings, and the semantic and syntactic relationships they capture.
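The semantic relationships this entry mentions are classically demonstrated with vector arithmetic over pretrained embeddings. The sketch below assumes a local GloVe text file is available (the file name is a hypothetical example):

```python
# Analogy via vector arithmetic: king - man + woman ≈ queen.
# Assumes a downloaded GloVe text file; the path below is hypothetical.
import numpy as np

def load_glove(path):
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.split()
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

def most_similar(query, vectors, exclude):
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(
        (w for w in vectors if w not in exclude),
        key=lambda w: cos(vectors[w], query),
    )

vectors = load_glove("glove.6B.100d.txt")  # hypothetical local file
query = vectors["king"] - vectors["man"] + vectors["woman"]
print(most_similar(query, vectors, exclude={"king", "man", "woman"}))
```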
Pretraining: Transformers & Models
Explores pretraining models like BERT, T5, and GPT, discussing their training objectives and applications in natural language processing.
Word Embeddings: Models and Learning
Explores word embeddings, context importance, and learning algorithms for creating new representations.
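One way to make the role of context concrete is a count-based baseline: build a word-context co-occurrence matrix and factor it with truncated SVD. The sketch below is an illustrative toy under those assumptions, not the lecture's own algorithm:

```python
# Count-based embedding sketch: co-occurrence counts within a window,
# then truncated SVD to get dense vectors. Corpus and sizes are toys.
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
window = 2
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric window counts.
counts = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            counts[idx[w], idx[corpus[j]]] += 1

# Truncated SVD: rows of U * S are the word embeddings.
U, S, _ = np.linalg.svd(counts)
dim = 3
embeddings = U[:, :dim] * S[:dim]
print(dict(zip(vocab, np.round(embeddings, 2))))
```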
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Contextual Representations: ELMo and BERT Overview
Covers contextual representations in NLP, focusing on the ELMo and BERT architectures and their applications across tasks.
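Unlike static embeddings, the contextual representations this entry covers give the same word a different vector in each sentence. A minimal sketch with the Hugging Face transformers library (a standard tool, though not necessarily the one used in the lecture; the checkpoint choice is an assumption):

```python
# Contextual embeddings with BERT: the same word type ("bank") gets
# different vectors depending on its sentence. Checkpoint is illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence, word):
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (tokens, dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

v1 = embedding_of("she sat by the river bank", "bank")
v2 = embedding_of("he robbed the bank downtown", "bank")
print(torch.cosine_similarity(v1, v2, dim=0))  # < 1: context-dependent
```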