From Stochastic Gradient Descent to Non-Smooth Optimization
Related lectures (28)
Stochastic Gradient Descent: Optimization Techniques
Explores Stochastic Gradient Descent with Averaging, compares it with plain Gradient Descent, and discusses challenges in non-convex optimization and sparse recovery techniques.
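The averaging idea mentioned above (often called Polyak–Ruppert averaging) can be sketched as follows; the noisy quadratic objective and step-size schedule here are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def sgd_with_averaging(grad, x0, step, n_iters, rng):
    """Run plain SGD but return the running average of the iterates,
    which smooths out gradient noise (Polyak-Ruppert averaging)."""
    x = np.asarray(x0, dtype=float)
    avg = x.copy()
    for t in range(1, n_iters + 1):
        x = x - step(t) * grad(x, rng)
        avg += (x - avg) / t        # running mean of x_1, ..., x_t
    return avg

# Toy problem (assumed): minimize f(x) = ||x||^2 / 2 from noisy gradients.
rng = np.random.default_rng(0)
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
x_avg = sgd_with_averaging(noisy_grad, np.ones(3),
                           lambda t: 0.5 / t**0.5, 2000, rng)
```

The averaged iterate ends much closer to the minimizer at the origin than any single noisy iterate typically would.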
Feature Engineering: Polynomial Regression
Covers fitting linear regression on transformed features of the original predictors to obtain a more flexible model.
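A minimal sketch of the feature-engineering idea: the model stays linear in its parameters, but the features are powers of the original predictor. The toy cubic-free data below is an assumption for illustration:

```python
import numpy as np

# Polynomial feature expansion: fit a linear model on 1, x, x^2.
rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 40)
y = 1 - 2 * x + 3 * x**2 + 0.01 * rng.standard_normal(40)

Phi = np.vander(x, N=3, increasing=True)      # columns: 1, x, x^2
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # ordinary least squares
```

The recovered coefficients approximate the generating values (1, -2, 3) despite the model being "just" linear regression on expanded features.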
Linear Regression: Basics and Estimation
Covers the basics of linear regression and how to solve estimation problems using least squares and matrix notation.
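The least-squares estimate in matrix notation is beta = (X^T X)^{-1} X^T y; a short sketch on assumed toy data:

```python
import numpy as np

# Toy data (assumed): y = 2*x + 1 plus a little noise.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=50)
y = 2 * x + 1 + 0.05 * rng.standard_normal(50)

X = np.column_stack([np.ones_like(x), x])     # intercept column + predictor
# lstsq solves the normal equations in a numerically stabler way
# than explicitly inverting X^T X.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

beta[0] estimates the intercept (about 1) and beta[1] the slope (about 2).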
Optimization Techniques: Gradient Method Overview
Discusses the gradient method for optimization, focusing on its application in machine learning and the conditions for convergence.
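A standard convergence condition for the fixed-step gradient method is step size below 2/L, where L is the Lipschitz constant of the gradient; the quadratic objective below is an assumed example:

```python
import numpy as np

def gradient_descent(grad, x0, step, n_iters):
    # Fixed-step gradient method; converges for step < 2/L when
    # grad is L-Lipschitz (here L = largest eigenvalue of A).
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - step * grad(x)
    return x

# Quadratic f(x) = 0.5 x^T A x - b^T x; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A).max()
x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2), 1.0 / L, 500)
```

With step 1/L the iterates contract geometrically toward the solution of A x = b.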
Linear Models: Continued
Explores linear models, logistic regression, gradient descent, and multi-class logistic regression with practical applications and examples.
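Logistic regression trained by batch gradient descent can be sketched in a few lines; the separable one-dimensional data below is an assumption for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, step=0.5, n_iters=2000):
    # Batch gradient descent on the logistic loss (labels in {0, 1}).
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= step * grad
    return w

# Toy data (assumed): label = 1 exactly when x > 0.
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 200)
y = (x > 0).astype(float)
X = np.column_stack([np.ones_like(x), x])   # intercept + predictor

w = fit_logistic(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(float)
```

On this separable data the learned decision boundary sits near x = 0, so the predictions match the labels almost perfectly.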
Cross-Validation: Techniques and Applications
Explores cross-validation, overfitting, regularization, and regression techniques in machine learning.
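Plain k-fold cross-validation, the core tool for detecting overfitting, can be sketched as below; the least-squares model and toy linear data are assumptions:

```python
import numpy as np

def kfold_mse(X, y, fit, predict, k=5, seed=0):
    """Shuffle, split into k folds, and average the held-out MSE."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])
        errs.append(np.mean((predict(model, X[test]) - y[test]) ** 2))
    return float(np.mean(errs))

# Example: cross-validate ordinary least squares on toy linear data.
rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, 100)
y = 3 * x + 0.1 * rng.standard_normal(100)
X = np.column_stack([np.ones_like(x), x])

fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda w, X: X @ w
cv_mse = kfold_mse(X, y, fit, predict)
```

For a well-specified model the cross-validated MSE lands near the noise variance (here about 0.01), which is the baseline to compare richer models against.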
Back to Linear Regression
Covers linear regression, regularization, inverse problems, X-ray tomography, image reconstruction, data inference, and detector intensity.
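Regularization for inverse problems of this kind is often introduced via ridge regression, which has the closed form (X^T X + lam*I)^{-1} X^T y; the toy well-conditioned data below is an assumption, not the tomography setting itself:

```python
import numpy as np

def ridge(X, y, lam):
    # Closed-form ridge estimate: solve (X^T X + lam I) w = X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data (assumed): linear model with known weights plus noise.
rng = np.random.default_rng(5)
X = rng.standard_normal((30, 5))
w_true = np.array([1.0, -1.0, 0.5, 0.0, 2.0])
y = X @ w_true + 0.05 * rng.standard_normal(30)

w_ridge = ridge(X, y, lam=0.1)
```

The penalty lam trades a small bias for a better-conditioned system, which is what makes ill-posed reconstruction problems solvable.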
Gradient Descent: Optimization Techniques
Explores gradient descent, loss functions, and optimization techniques in neural network training.