Why are there so many saddle points?: Loss landscape and optimization methods
Related lectures (30)
The Hidden Convex Optimization Landscape of Deep Neural Networks
Explores the hidden convex optimization landscape of deep neural networks, showcasing the transition from non-convex to convex models.
Perception: Data-Driven Approaches
Explores perception in deep learning for autonomous vehicles, covering image classification, optimization methods, and the role of representation in machine learning.
Non-Convex Optimization: Techniques and Applications
Covers non-convex optimization techniques and their applications in machine learning.
Neural Networks Optimization
Explores neural networks optimization, including backpropagation, batch normalization, weight initialization, and hyperparameter search strategies.
Neural Networks: Deep Neural Networks
Explores the basics of neural networks, with a focus on deep neural networks and their architecture and training.
Understanding Machine Learning: Exactly Solvable Models
Explores the statistical mechanics of learning, focusing on the open questions and computational challenges of neural networks.
Structures in Non-Convex Optimization
Delves into structures in non-convex optimization, emphasizing scalable optimization for deep learning.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Neural Networks: Training and Optimization
Explores the training and optimization of neural networks, addressing challenges like non-convex loss functions and local minima.
Multilayer Neural Networks: Deep Learning
Covers the fundamentals of multilayer neural networks and deep learning.