Lecture
Optimization Basics: Unconstrained Optimization and Gradient Descent
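The lecture's core idea, minimizing a differentiable function by repeatedly stepping against its gradient, can be sketched in a few lines. The quadratic objective, step size, and iteration count below are illustrative assumptions, not taken from the lecture itself.

```python
import numpy as np

# Minimal gradient descent for the unconstrained quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
# A, b, the step size, and the iteration count are illustrative choices.

def gradient_descent(grad, x0, step=0.1, iters=200):
    """Fixed-step gradient descent: x <- x - step * grad(x)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2))
# The exact minimizer solves A x = b, so the iterate should approach it.
print(np.allclose(x_star, np.linalg.solve(A, b), atol=1e-6))
```

With a step size below 2 divided by the largest eigenvalue of A, each iteration contracts the error, so 200 iterations are more than enough here.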
Related lectures (27)
Page 2 of 3
Optimization: Gradient Descent and Subgradients
Explores optimization methods like gradient descent and subgradients for training machine learning models, including advanced techniques like Adam optimization.
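For nonsmooth objectives, the gradient step in the sketch above is replaced by a subgradient step with a diminishing step size. This minimal sketch, assuming the objective f(x) = |x| and the step schedule 1/(k+1) as illustrative choices, shows the pattern.

```python
# Subgradient descent on the nonsmooth f(x) = |x|, whose subgradient
# at x != 0 is sign(x) (and 0 is a valid subgradient at x = 0).
# Objective, starting point, and step schedule are illustrative choices,
# not taken from the lecture.

def subgradient_descent(subgrad, x0, iters=1000):
    x = x0
    for k in range(iters):
        x = x - subgrad(x) / (k + 1)  # diminishing step size 1/(k+1)
    return x

sign = lambda x: (x > 0) - (x < 0)  # a subgradient of |x|
x_star = subgradient_descent(sign, x0=2.0)
# The iterates oscillate around the minimizer 0 with shrinking amplitude.
print(abs(x_star) < 0.01)
```

Unlike gradient descent on smooth functions, a constant step size would not converge here; the diminishing schedule is what drives the oscillation amplitude to zero.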
Newton's Method: Optimization Techniques
Explores optimization techniques like gradient descent, line search, and Newton's method for efficient problem-solving.
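Newton's method replaces the fixed gradient step with a step scaled by the inverse second derivative, which yields quadratic convergence near the minimizer. A hedged one-dimensional sketch, using the test function f(x) = x - log(x) (minimizer at x = 1) as an illustrative choice:

```python
# Newton's method for 1-D minimization: x <- x - f'(x) / f''(x).
# The test function and starting point are illustrative assumptions.

def newton_minimize(grad, hess, x0, iters=10):
    x = x0
    for _ in range(iters):
        x = x - grad(x) / hess(x)
    return x

# f(x) = x - log(x):  f'(x) = 1 - 1/x,  f''(x) = 1/x^2, minimizer x* = 1.
x_star = newton_minimize(lambda x: 1 - 1 / x, lambda x: 1 / x**2, x0=0.5)
print(abs(x_star - 1.0) < 1e-12)
```

The error roughly squares each iteration (0.25, 0.0625, 0.004, ...), so ten iterations reach machine precision from this starting point.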
Newton's Method: Optimization & Indefiniteness
Covers Newton's Method for optimization and discusses the caveats of indefiniteness in optimization problems.
Optimization Techniques: Gradient Descent and Convex Functions
Provides an overview of optimization techniques, focusing on gradient descent and properties of convex functions in machine learning.
Optimization with Constraints: KKT Conditions
Covers the KKT conditions for optimization with constraints, essential for solving constrained optimization problems efficiently.
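For an equality-constrained quadratic program, the KKT conditions reduce to a single linear system in the primal variable and the multiplier, which makes them easy to check numerically. The Q, c, A, and b below are illustrative assumptions, not data from the lecture.

```python
import numpy as np

# Equality-constrained QP:  min 0.5 x^T Q x - c^T x  s.t.  A x = b.
# Stationarity (Q x + A^T lam = c) and feasibility (A x = b) give the
# KKT system  [[Q, A^T], [A, 0]] [x; lam] = [c; b].
# Q, c, A, b are illustrative choices.

Q = np.array([[2.0, 0.0], [0.0, 2.0]])  # objective is x1^2 + x2^2
c = np.array([0.0, 0.0])
A = np.array([[1.0, 1.0]])              # constraint x1 + x2 = 1
b = np.array([1.0])

K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
sol = np.linalg.solve(K, np.concatenate([c, b]))
x, lam = sol[:2], sol[2:]
# Closest point to the origin on the line x1 + x2 = 1 is (0.5, 0.5).
print(np.allclose(x, [0.5, 0.5]), np.allclose(lam, [-1.0]))
```

Solving the KKT system directly is exact for this problem class; inequality constraints additionally require complementary slackness, which this sketch omits.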
Stochastic Optimization: Algorithms and Methods
Explores stochastic optimization algorithms and methods for convex problems with smooth and nonsmooth risks.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Faster and Projected Gradient Descent: Optimization Techniques
Discusses advanced optimization techniques, focusing on faster and projected gradient descent methods in machine learning.
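Projected gradient descent handles a constraint set by following each gradient step with a Euclidean projection back onto the set. A minimal sketch, assuming an illustrative box constraint and target point not taken from the lecture:

```python
import numpy as np

# Projected gradient descent: minimize f(x) = ||x - c||^2 over the box
# [0, 1]^2, clipping each iterate back into the feasible set.
# The target c and the step size are illustrative assumptions.

def projected_gradient_descent(grad, project, x0, step=0.1, iters=100):
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))  # gradient step, then projection
    return x

c = np.array([2.0, -0.5])                # unconstrained optimum lies outside the box
grad = lambda x: 2 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)  # Euclidean projection onto the box

x_star = projected_gradient_descent(grad, project, np.array([0.5, 0.5]))
# The constrained minimizer is the projection of c onto the box: (1, 0).
print(np.allclose(x_star, [1.0, 0.0], atol=1e-8))
```

For a box, the projection is simply coordinate-wise clipping; other simple sets (a ball, a simplex) admit similarly cheap projections, which is what makes the method practical.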
Gradient Descent: Optimization and Constraints
Discusses gradient descent for optimization with equality constraints and iterative convergence criteria.
Line Search: Optimization Basics
Covers the basics of optimization in geometric computing, focusing on finding the best modification efficiently.
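A common line-search rule is backtracking with the Armijo sufficient-decrease condition: start from a full step and halve it until the objective drops enough. The test function and constants below are illustrative choices, not from the lecture.

```python
import numpy as np

# Gradient descent with backtracking (Armijo) line search.
# The objective, backtracking constants, and iteration count are
# illustrative assumptions.

def backtracking_step(f, grad_x, x, direction, alpha=1.0, beta=0.5, c=1e-4):
    """Shrink alpha until f(x + alpha d) <= f(x) + c * alpha * grad.d."""
    while f(x + alpha * direction) > f(x) + c * alpha * grad_x @ direction:
        alpha *= beta
    return alpha

def gd_with_line_search(f, grad, x0, iters=200):
    x = x0
    for _ in range(iters):
        g = grad(x)
        d = -g                                   # steepest-descent direction
        x = x + backtracking_step(f, g, x, d) * d
    return x

f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

x_star = gd_with_line_search(f, grad, np.zeros(2))
print(np.allclose(x_star, [1.0, -2.0], atol=1e-6))  # minimizer (1, -2)
```

Because the step size adapts to the local curvature, no step-size tuning is needed even though the two coordinates of this objective are scaled very differently.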