Gradient Descent Methods for Artificial Neural Networks
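The topic named in the title rests on one update rule: repeatedly move the parameters against the gradient of the loss, w ← w − η∇L(w). As a minimal hypothetical sketch (not material from the lecture itself), here is plain gradient descent minimizing a one-dimensional quadratic loss L(w) = (w − 3)²:

```python
# Minimal sketch of gradient descent: minimize L(w) = (w - 3)^2
# via the update w <- w - eta * dL/dw. Hypothetical example values.

def gradient_descent(grad, w0, eta=0.1, steps=100):
    """Run plain gradient descent from w0 with learning rate eta."""
    w = w0
    for _ in range(steps):
        w = w - eta * grad(w)
    return w

# dL/dw = 2 * (w - 3); the unique minimizer is w = 3.
w_star = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(round(w_star, 4))  # prints 3.0
```

With η = 0.1 the error (w − 3) shrinks by a factor of 0.8 per step, so 100 steps bring w within ~10⁻⁹ of the minimizer; too large an η would instead make the iterates diverge.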
Related lectures (31)
Multilayer Networks: First Steps
Covers the preparation for deriving the Backprop algorithm in layered networks using multi-layer perceptrons and gradient descent.
Neural Networks: Training and Optimization
Explores neural network training, optimization, and environmental considerations, with insights into PCA and K-means clustering.
Deep Learning: Multilayer Perceptron and Training
Covers deep learning fundamentals, focusing on multilayer perceptrons and their training processes.
Introduction to Machine Learning: Supervised Learning
Introduces supervised learning, covering classification, regression, model optimization, overfitting, and kernel methods.
Supervised Learning Overview
Covers CNNs, RNNs, SVMs, and supervised learning methods, emphasizing the importance of tuning regularization and making informed decisions in machine learning.
Neural Networks: Supervised Learning and Backpropagation
Explains neural networks, supervised learning, and backpropagation for training and improving performance.
Gradient Descent on Two-Layer ReLU Neural Networks
Analyzes gradient descent on two-layer ReLU neural networks, exploring global convergence, regularization, implicit bias, and statistical efficiency.
Reinforcement Learning Concepts
Covers key concepts in reinforcement learning, neural networks, clustering, and unsupervised learning, emphasizing their applications and challenges.
Introduction to Learning by Stochastic Gradient Descent: Simple Perceptron
Covers the derivation of the stochastic gradient descent formula for a simple perceptron and explores the geometric interpretation of classification.
Gradient Descent and Linear Regression
Covers stochastic gradient descent, linear regression, regularization, supervised learning, and the iterative nature of gradient descent.
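Several of the lectures above derive stochastic-gradient-style learning for a simple perceptron. As a hedged sketch of that idea (assumed setup: threshold activation, labels in {0, 1}, bias folded into the inputs as a constant feature; not the lectures' exact formulation), the classic perceptron rule updates the weights by η(target − prediction)·x, one example at a time:

```python
import numpy as np

# Hypothetical sketch: perceptron learning as one stochastic-gradient-style
# update per example. Assumes a threshold unit, {0, 1} labels, and the bias
# folded into the inputs as a constant last column.

def train_perceptron(X, y, eta=1.0, epochs=25):
    """Update w by eta * (target - prediction) * x, one example at a time."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1.0 if xi @ w > 0 else 0.0
            w += eta * (yi - pred) * xi
    return w

# Tiny linearly separable dataset: logical AND; last column is the bias input.
X = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
y = np.array([0., 0., 0., 1.])
w = train_perceptron(X, y)
preds = (X @ w > 0).astype(float)
print(preds)  # prints [0. 0. 0. 1.]
```

Because the AND data are linearly separable, the perceptron convergence theorem guarantees this loop makes only finitely many corrective updates before classifying every example correctly; here a handful of epochs suffice.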