Contextual Representations: ELMo and BERT Overview
Related lectures (31)
Neuro-symbolic Representations: Commonsense Knowledge & Reasoning
Explores neuro-symbolic representations for understanding commonsense knowledge and reasoning, emphasizing the challenges and limitations of deep learning in natural language processing.
BERT: Pretraining and Applications
Delves into pretraining of the BERT transformer model and discusses its applications in NLP tasks.
Transformers: Unifying Machine Learning Communities
Covers the role of Transformers in unifying various machine learning fields.
Language Modelling and Recurrent Neural Networks
Explores language modelling, RNNs, n-gram models, LSTMs, and bidirectional RNNs.
Coreference Resolution
Delves into coreference resolution, discussing challenges, advancements, and evaluation methods.
Deep Generative Models: Part 2
Explores deep generative models, including mixtures of multinomials, PCA, deep autoencoders, convolutional autoencoders, and GANs.
Modern NLP and Ethics in NLP
Delves into advancements and challenges in NLP, along with ethical considerations and potential harms.
Long Short-Term Memory Networks
Introduces Long Short-Term Memory (LSTM) networks as a solution to vanishing and exploding gradients in recurrent neural networks.
Deep Learning: Graphs and Transformers Overview
Covers deep learning concepts, focusing on graphs, transformers, and their applications in multimodal data processing.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.