Optimization in Machine Learning: Gradient Descent
Related lectures (31)
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
Gradient Descent: Optimization Techniques
Explores gradient descent, loss functions, and optimization techniques in neural network training.
Deep Learning: Multilayer Perceptron and Training
Covers deep learning fundamentals, focusing on multilayer perceptrons and their training processes.
Neural Networks: Multilayer Perceptrons
Covers Multilayer Perceptrons, artificial neurons, activation functions, matrix notation, flexibility, regularization, regression, and classification tasks.
Linear and Logistic Regression
Introduces linear and logistic regression, covering parametric models, multi-output prediction, non-linearity, gradient descent, and classification applications.
Neural Networks: Multilayer Learning
Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and network architectures like LeNet, AlexNet, and VGG-16.
Multilayer Neural Networks: Deep Learning
Covers the fundamentals of multilayer neural networks and deep learning.
Linear Regression and Logistic Regression
Covers linear and logistic regression for regression and classification tasks, focusing on loss functions and model training.
Gradient Descent and Linear Regression
Covers stochastic gradient descent, linear regression, regularization, supervised learning, and the iterative nature of gradient descent.
Neural Networks: Regression and Classification
Explores neural networks for regression and classification tasks, covering training, regularization, and practical examples.