Introduction to Learning by Stochastic Gradient Descent: Simple Perceptron
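The lecture title refers to training a simple perceptron by stochastic gradient descent: samples are visited one at a time, and the weights are nudged only when a sample is misclassified. A minimal sketch of that update rule, on hypothetical toy data (the separable dataset, learning rate, and epoch count are illustrative assumptions, not taken from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearly separable toy data with labels in {-1, +1}
X = rng.normal(size=(200, 2))
w_true = np.array([2.0, -1.0])
y = np.sign(X @ w_true)

w = np.zeros(2)   # weight vector, initialized at zero
b = 0.0           # bias term
eta = 0.1         # learning rate (assumed value)

for epoch in range(20):
    # Stochastic: visit samples one at a time, in random order
    for i in rng.permutation(len(X)):
        if y[i] * (X[i] @ w + b) <= 0:   # misclassified (or on the boundary)
            w += eta * y[i] * X[i]       # perceptron update: move toward the sample
            b += eta * y[i]

accuracy = np.mean(np.sign(X @ w + b) == y)
```

For linearly separable data like this, the perceptron convergence theorem guarantees the loop eventually stops making updates, so the final accuracy reaches 1.0.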
Related lectures (31)
Introduction to Supervised Learning: Classification and Perceptrons
Explores supervised learning through classification as a geometric problem and the concept of finding a separating surface.
Deep Learning
Covers the fundamentals of deep learning, including data representations, bag of words, data pre-processing, artificial neural networks, and convolutional neural networks.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Neural Networks: Perceptron Model and Backpropagation Algorithm
Covers the perceptron model and backpropagation algorithm in neural networks.
Introduction to Machine Learning: Supervised Learning
Introduces supervised learning, covering classification, regression, model optimization, overfitting, and kernel methods.
Deep Learning: Data Representation and Multilayer Perceptron
Covers data representation, MLP training, activation functions, and gradient-based learning in deep neural networks.
Gradient Descent on Two-Layer ReLU Neural Networks
Analyzes gradient descent on two-layer ReLU neural networks, exploring global convergence, regularization, implicit bias, and statistical efficiency.
Gradient-Based Algorithms in High-Dimensional Learning
Provides insights into gradient-based algorithms, open questions in deep learning, and the challenges of non-convex problems.
Kernel Methods: Neural Networks
Covers the fundamentals of neural networks, focusing on RBF kernels and SVMs.
Multilayer Neural Networks: Deep Learning
Covers the fundamentals of multilayer neural networks and deep learning.