Trust region methods: framework & algorithms
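Since the lecture covers the trust-region framework, here is a minimal sketch of the standard scheme for a scalar function: a quadratic model minimized on a ball of radius delta, a ratio test comparing actual to predicted reduction, and the usual radius-update rules. The constants (0.25, 0.75, eta) and the closed-form 1-D subproblem solver are illustrative assumptions, not necessarily the lecture's exact algorithm.

```python
# Sketch of the basic trust-region framework in the scalar case.
# Constants and the subproblem solver are illustrative assumptions.

def trust_region_minimize(f, grad, hess, x0, delta0=1.0, delta_max=10.0,
                          eta=0.1, tol=1e-8, max_iter=100):
    """Minimize f: R -> R using a quadratic model on |p| <= delta."""
    x, delta = x0, delta0
    for _ in range(max_iter):
        g, h = grad(x), hess(x)
        if abs(g) < tol:
            break
        # Solve the 1-D subproblem: min_p g*p + 0.5*h*p^2 s.t. |p| <= delta.
        if h > 0 and abs(g / h) <= delta:
            p = -g / h                      # unconstrained model minimizer
        else:
            p = -delta if g > 0 else delta  # minimizer lies on the boundary
        # Ratio of actual to predicted reduction.
        pred = -(g * p + 0.5 * h * p * p)
        actual = f(x) - f(x + p)
        rho = actual / pred if pred > 0 else -1.0
        # Shrink the radius on poor agreement; expand on very good
        # agreement when the step hit the boundary.
        if rho < 0.25:
            delta *= 0.25
        elif rho > 0.75 and abs(p) == delta:
            delta = min(2.0 * delta, delta_max)
        # Accept the step only if the reduction ratio is large enough.
        if rho > eta:
            x = x + p
    return x

# Example: minimize f(x) = (x - 2)^2 + 1, starting far from the minimizer.
xstar = trust_region_minimize(lambda x: (x - 2.0)**2 + 1.0,
                              lambda x: 2.0 * (x - 2.0),
                              lambda x: 2.0,
                              x0=-5.0)
```

On this quadratic the model is exact, so every ratio equals one, the radius grows until the full Newton step fits inside the region, and the iteration reaches the minimizer at x = 2.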
Related lectures (24)
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Nonlinear Optimization
Covers line search, Newton's method, BFGS, and conjugate gradient in nonlinear optimization.
Descent methods and line search: Finiteness of the line search algorithm
Explores the Wolfe conditions for line search algorithms and proves that the line search terminates after finitely many iterations.
Convex Optimization: Gradient Algorithms
Covers convex optimization problems and gradient-based algorithms to find the global minimum.
Newton's Method: Optimization Techniques
Explores optimization techniques like gradient descent, line search, and Newton's method for efficient problem-solving.
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, costs, and computational efforts for efficient model training.
TR global convergence (end) + CG
Covers the trust-region method and introduces the truncated conjugate gradients method.
Optimization without Constraints: Gradient Method
Covers optimization without constraints using the gradient method to find the function's minimum.
Gradient Descent
Covers the concept of gradient descent in scalar cases, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
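Several of the lectures above (gradient descent, descent methods with line search) revolve around the same basic iteration: move along the negative gradient with a step length chosen by backtracking. Below is a minimal sketch in the scalar case using the Armijo sufficient-decrease condition; the constants `c1` and the shrink factor are illustrative assumptions, not values taken from the lectures.

```python
# Sketch of scalar gradient descent with Armijo backtracking line search.
# Step-size constants are illustrative assumptions.

def gradient_descent(f, grad, x0, alpha0=1.0, c1=1e-4, shrink=0.5,
                     tol=1e-8, max_iter=1000):
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        d = -g  # descent direction: the negative gradient
        # Backtrack until the Armijo condition holds:
        #   f(x + a*d) <= f(x) + c1 * a * g * d
        a = alpha0
        while f(x + a * d) > f(x) + c1 * a * g * d:
            a *= shrink
        x = x + a * d
    return x

# Example: f(x) = (x - 3)^2 attains its minimum at x = 3.
xmin = gradient_descent(lambda x: (x - 3.0)**2,
                        lambda x: 2.0 * (x - 3.0),
                        x0=0.0)
```

Because the Armijo test rejects the overlong unit step and accepts the halved one, the iteration lands exactly on the minimizer of this quadratic in a single accepted step.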