Recurrent Neural Networks: Neural Language Models
Related lectures (32)
Sequence to Sequence Models: Overview and Applications
Covers sequence to sequence models, their architecture, applications, and the role of attention mechanisms in improving performance.
Deep Learning Building Blocks
Covers the fundamental building blocks of deep learning, including tensors, backpropagation, and PyTorch.
Building Physical Neural Networks
Discusses challenges in building physical neural networks, focusing on depth, connections, and trainability.
Language Modelling and Recurrent Neural Networks
Explores language modelling, RNNs, n-gram models, LSTMs, and bidirectional RNNs.
Recurrent Neural Networks: LSTMs & GRUs
Explores LSTM and GRU variants of recurrent neural networks, covering their advantages and the challenges they address.
Deep Learning for NLP
Explores deep learning for NLP, covering word embeddings, context representations, learning techniques, and challenges like vanishing gradients and ethical considerations.
Deep Learning: Backward Propagation and Vanishing Gradient
Delves into backward propagation in deep learning, addressing the vanishing gradient challenge and the need for effective hidden units.
Neural Networks Optimization
Explores neural networks optimization, including backpropagation, batch normalization, weight initialization, and hyperparameter search strategies.
Deep Learning: Multilayer Perceptron and Training
Covers deep learning fundamentals, focusing on multilayer perceptrons and their training processes.
Vanishing Gradient Problem: Deep Learning
Discusses the vanishing gradient problem in deep neural networks and its solutions.