Gradient Descent: Optimization and Constraints
Related lectures (27)
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Optimization with Constraints: KKT Conditions
Covers the KKT conditions for optimization with constraints, essential for solving constrained optimization problems efficiently.
Linear Programming: Weighted Bipartite Matching
Covers linear programming, weighted bipartite matching, and vertex cover problems in optimization.
Formulation, Problem Transformations
Explores how to transform optimization problems into equivalent formulations that meet the requirements of a given algorithm.
Optimization without Constraints: Gradient Method
Covers optimization without constraints using the gradient method to find the function's minimum.
Linear Programming: Solving LPs
Covers the process of solving Linear Programs (LPs) using the simplex method.
Optimization with Constraints: KKT Conditions Explained
Covers the KKT conditions for optimization with constraints, detailing their application and significance in solving constrained problems.
Nonlinear Optimization
Covers line search, Newton's method, BFGS, and conjugate gradient in nonlinear optimization.
Optimization Techniques: Stochastic Gradient Descent and Beyond
Discusses optimization techniques in machine learning, focusing on stochastic gradient descent and its applications in constrained and non-convex problems.
Optimization with Constraints: KKT Conditions
Covers optimization with constraints, focusing on the Karush-Kuhn-Tucker (KKT) conditions.