Lecture

Descent methods and line search: Finiteness of the line search algorithm

Description

This lecture covers the Wolfe conditions for line search algorithms, including the first and second conditions and the parameters involved. It explains how the line search algorithm is initialized and how violations of each Wolfe condition are handled. The lecture then examines the properties of the algorithm and proves that the line search terminates after a finite number of iterations. In the proof of this finiteness theorem, it shows that Wolfe 1 would be verified for all the generated step lengths, which leads to a contradiction if the algorithm performed an infinite number of iterations.
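To make the discussion concrete, here is a minimal sketch of a bracketing line search driven by the two Wolfe conditions, in the spirit of what the lecture analyzes. It is an illustration, not the lecture's own code: the names wolfe_line_search, beta1 and beta2, and the default parameter values are assumptions, with the usual requirement 0 < beta1 < beta2 < 1.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, alpha0=1.0, beta1=1e-4, beta2=0.99, max_iter=1000):
    """Return a step alpha along the descent direction d satisfying both Wolfe conditions.

    Illustrative sketch (assumed names): f is the objective, grad its gradient,
    x the current iterate, d a descent direction, 0 < beta1 < beta2 < 1.
    """
    fx = f(x)
    slope = grad(x) @ d          # directional derivative, negative for a descent direction
    alpha_l, alpha_r = 0.0, np.inf
    alpha = alpha0
    for _ in range(max_iter):    # safeguard only; the loop is provably finite
        x_new = x + alpha * d
        if f(x_new) > fx + beta1 * alpha * slope:
            # Wolfe 1 (sufficient decrease) violated: the step is too long,
            # shrink the bracket from the right and bisect.
            alpha_r = alpha
            alpha = 0.5 * (alpha_l + alpha_r)
        elif grad(x_new) @ d < beta2 * slope:
            # Wolfe 2 (curvature) violated: the step is too short,
            # move the left end and extrapolate (or bisect once bracketed).
            alpha_l = alpha
            alpha = 2.0 * alpha if np.isinf(alpha_r) else 0.5 * (alpha_l + alpha_r)
        else:
            return alpha         # both Wolfe conditions hold
    return alpha

# Example use on a quadratic, with d = -gradient as descent direction.
if __name__ == "__main__":
    f = lambda x: 0.5 * x @ x
    grad = lambda x: x
    x = np.array([3.0, -4.0])
    d = -grad(x)
    print(wolfe_line_search(f, grad, x, d))
```

The update rules make the bracketing interval shrink (or the trial step grow geometrically until a right endpoint is found), which is essentially the mechanism exploited by the finiteness argument discussed in the lecture.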

In MOOCs (6)
Optimization: principles and algorithms - Linear optimization
Introduction to linear optimization, duality and the simplex algorithm.
Optimization: principles and algorithms - Network and discrete optimization
Introduction to network optimization and discrete optimization
Optimization: principles and algorithms - Unconstrained nonlinear optimization
Introduction to unconstrained nonlinear optimization, Newton’s algorithms and descent methods.
Related lectures (45)
Optimisation in Energy Systems
Explores optimization in energy system modeling, covering decision variables, objective functions, and different strategies with their pros and cons.
Quasi-Newton Optimization
Covers gradient line search methods and optimization techniques with an emphasis on Wolfe conditions and positive definiteness.
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Choosing a Step Size (MOOC: Introduction to optimization on smooth manifolds: first order methods)
Explores choosing a step size in optimization on manifolds, including backtracking line-search and the Armijo method.
Nonlinear Optimization
Covers line search, Newton's method, BFGS, and conjugate gradient in nonlinear optimization.