Deep and Convolutional Networks: Generalization and Optimization
Related lectures (30)
Nonlinear Supervised Learning
Explores the inductive bias of different nonlinear supervised learning methods and the challenges of hyper-parameter tuning.
Deep Learning: Data Representations and Neural Networks
Covers data representations, Bag of Words, histograms, data pre-processing, and neural networks.
Gradient Descent on Two-Layer ReLU Neural Networks
Analyzes gradient descent on two-layer ReLU neural networks, exploring global convergence, regularization, implicit bias, and statistical efficiency.
Structures in Non-Convex Optimization
Covers non-convex optimization, deep learning training problems, stochastic gradient descent, adaptive methods, and neural network architectures.
Multilayer Perceptron: Training and Optimization
Explores the multilayer perceptron model, training, optimization, data preprocessing, activation functions, backpropagation, and regularization.
Deep Learning Fundamentals
Introduces deep learning, from logistic regression to neural networks, emphasizing the need to handle non-linearly separable data.
Deep Splines: Unifying Framework for Deep Neural Networks
Introduces a functional framework for deep neural networks with adaptive piecewise-linear splines, focusing on biomedical image reconstruction and the challenges of deep splines.
Neural Network Approximation and Learning
Delves into neural network approximation, supervised learning, challenges in high-dimensional learning, and the experimental revolution in deep learning.
Double Descent Curves: Overparametrization
Explores double descent curves and overparametrization in machine learning models, highlighting the associated risks and benefits.
Deep Learning: Multilayer Perceptron and Training
Covers deep learning fundamentals, focusing on multilayer perceptrons and their training processes.