Neural Word Embeddings: Learning Representations for Natural Language
Related lectures (30)
Natural Language Processing: A Primer
Introduces Natural Language Processing (NLP), covering tokenization, machine learning, sentiment analysis, and NLP applications in Switzerland.
Word Embeddings: Introduction and Applications
Introduces word embeddings, explaining how they capture word meaning from context, and surveys their applications in natural language processing tasks.
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
Contextual Representations: ELMo and BERT Overview
Covers contextual representations in NLP, focusing on the ELMo and BERT architectures and their applications across various tasks.
Deep Learning: Principles and Applications
Covers the fundamentals of deep learning, including data, architecture, and ethical considerations in model deployment.
Compositional Representations and Systematic Generalization
Examines systematicity, compositionality, neural network challenges, and unsupervised learning in NLP.
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Deep Learning: Graphs and Transformers Overview
Covers deep learning concepts, focusing on graphs, transformers, and their applications in multimodal data processing.
Deep Learning for Question Answering
Explores deep learning for question answering, analyzing neural network architectures and model robustness to noise.
Machine Learning: Supervised and Unsupervised Learning Techniques
Covers supervised and unsupervised learning techniques in machine learning, highlighting their applications in finance and environmental analysis.