Understanding Generalization: Implicit Bias & Optimization
Related lectures (28)
Gradient Descent and Linear Regression
Covers stochastic gradient descent, linear regression, regularization, supervised learning, and the iterative nature of gradient descent (see the gradient-descent sketch after this list).
Algorithms & Growth of Functions
Covers optimization algorithms, stable matching, and Big-O notation for algorithm efficiency.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Trust region methods: framework & algorithms
Covers trust region methods, focusing on the framework and algorithms.
Polynomial Regression and Gradient Descent
Covers polynomial regression, gradient descent, overfitting, underfitting, regularization, and feature scaling in optimization algorithms.
Proximal and Subgradient Descent: Optimization Techniques
Discusses proximal and subgradient descent methods for optimization in machine learning.
Adaptive Gradient Methods
Explores adaptive gradient methods like AdaGrad, AcceleGrad, and UniXGrad, focusing on their local adaptation and convergence rates (see the AdaGrad sketch after this list).
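
The first lecture above centers on the iterative gradient-descent update for regularized linear regression. What follows is a minimal sketch, not taken from the lecture itself: the function name gradient_descent, the ridge objective, and the toy data are assumptions chosen for illustration.

import numpy as np

def gradient_descent(X, y, lam=0.1, lr=0.01, n_iters=1000):
    # Minimize (1/2n)||Xw - y||^2 + (lam/2)||w||^2 by batch gradient descent.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n + lam * w   # gradient of the ridge objective
        w -= lr * grad                           # fixed-step-size iterative update
    return w

# Toy usage: recover weights from noisy linear data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=200)
print(gradient_descent(X, y))

The fixed step size lr is the simplest choice; the accelerated and adaptive variants listed above replace it with momentum or per-coordinate schedules.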
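For the last entry, the distinguishing feature of AdaGrad is a per-coordinate step size built from the running sum of squared gradients. Below is a minimal sketch on the same ridge objective, again an assumption for illustration rather than the lecture's own code; AcceleGrad and UniXGrad add acceleration on top of this idea and are not reproduced here.

import numpy as np

def adagrad(X, y, lam=0.1, lr=0.5, n_iters=1000, eps=1e-8):
    n, d = X.shape
    w = np.zeros(d)
    g_sq = np.zeros(d)                           # running sum of squared gradients
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n + lam * w
        g_sq += grad ** 2
        w -= lr * grad / (np.sqrt(g_sq) + eps)   # coordinate-wise adaptive step
    return w

Coordinates that keep seeing large gradients get smaller steps, which is the local adaptation the lecture summary refers to.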