Optimization Basics: Unconstrained Optimization and Gradient Descent
Related lectures (27)
Optimization Basics
Introduces optimization basics, covering logistic regression, derivatives, convex functions, gradient descent, and second-order methods.
Mathematics of Data: Computation Role
Explores the role of computation in the mathematics of data, focusing on iterative methods, optimization, estimators, and descent principles.
Convergence Criteria: Necessary Conditions
Explains necessary conditions for convergence in optimization problems.
Optimality of Convergence Rates: Accelerated/Stochastic Gradient Descent
Covers the optimality of convergence rates in accelerated and stochastic gradient descent methods for non-convex optimization problems.
Faster Gradient Descent: Projected Optimization Techniques
Covers faster gradient descent methods and projected gradient descent for constrained optimization in machine learning.
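The projected gradient method named above can be sketched in a few lines: take an ordinary gradient step, then project the iterate back onto the feasible set. The objective, constraint set, and step size below are illustrative choices, not taken from the lecture.

```python
import numpy as np

def projected_gradient_descent(grad, project, x0, lr=0.1, steps=200):
    """Minimize f over a constraint set: gradient step, then project back."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = project(x - lr * grad(x))  # descend, then re-enter the feasible set
    return x

# Illustrative problem: minimize ||x - b||^2 over the unit ball, with b outside the ball.
b = np.array([3.0, 4.0])
grad = lambda x: 2 * (x - b)
project = lambda x: x / max(1.0, np.linalg.norm(x))  # Euclidean projection onto the unit ball
x_star = projected_gradient_descent(grad, project, np.zeros(2))
# Converges to b rescaled to the boundary: [0.6, 0.8]
```

Because the unconstrained minimizer `b` lies outside the ball, the iterates settle on the boundary point closest to it, which is exactly what the projection step enforces.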
Proximal and Subgradient Descent: Optimization Techniques
Discusses proximal and subgradient descent methods for optimization in machine learning.
Gradient Descent
Covers the concept of gradient descent in scalar cases, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
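The scalar case described above reduces to a one-line update: repeatedly move opposite the derivative. A minimal sketch, with an illustrative quadratic objective and step size not taken from the lecture:

```python
def gradient_descent_1d(df, x0, lr=0.1, steps=100):
    """Scalar gradient descent: step against the derivative until it vanishes."""
    x = x0
    for _ in range(steps):
        x = x - lr * df(x)  # negative gradient direction in 1-D is just -df(x)
    return x

# Minimize f(x) = (x - 2)^2, whose derivative is 2(x - 2); the minimum sits at x = 2.
x_min = gradient_descent_1d(lambda x: 2 * (x - 2), x0=10.0)
```

For this convex quadratic the update contracts the distance to the minimizer by a constant factor each step, so the iterate converges to 2 geometrically.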