Word Embeddings: Introduction and Applications
Related lectures (31)
Language Models: Fixed-context and Recurrent Neural Networks
Discusses language models, focusing on fixed-context neural models and recurrent neural networks.
Deep Learning for Question Answering
Explores deep learning for question answering, analyzing neural networks and model robustness to noise.
Introduction to Natural Language Processing
Introduces the basics of Natural Language Processing, covering challenges, application domains, and linguistic processing levels.
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
Deep Learning for NLP
Explores deep learning for NLP, covering word embeddings, context representations, learning techniques, and challenges like vanishing gradients and ethical considerations.
Prompting and Alignment
Explores prompting, alignment, and the capabilities of large language models for natural language processing tasks.
Document Retrieval and Classification
Covers document retrieval, classification, sentiment analysis, and topic detection using TF-IDF matrices and contextualized word vectors like BERT.
Text Handling: Matrix, Documents, Topics
Explores text handling, focusing on matrices, documents, and topics, including challenges in document classification and advanced models like BERT.
Contextual Representations: ELMo and BERT Overview
Covers contextual representations in NLP, focusing on the ELMo and BERT architectures and their applications across a range of tasks.
Introduction to NLP and the Course
Covers the basics of Natural Language Processing, including challenges, linguistic processing levels, and the impact of power laws.