Lecture

Descent methods and line search: Second Wolfe condition

Description

This lecture delves into the second Wolfe condition, which rules out steps that are too short along a descent direction. By requiring a sufficient increase in the directional derivative at the candidate point, the condition leaves the algorithm a range of acceptable step sizes, with the parameter beta_2 controlling how strict the requirement is. Through illustrations and examples, the instructor explains how the second Wolfe condition characterizes the progress of the algorithm and determines whether a step is accepted or rejected, ultimately shaping the line search and the optimization process.
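
For reference, the second Wolfe (curvature) condition for a step size alpha along a descent direction d at a point x is commonly written as grad f(x + alpha d)' d >= beta_2 * grad f(x)' d, with 0 < beta_2 < 1: since grad f(x)' d is negative for a descent direction, the directional derivative at the candidate point must have increased enough, which is exactly what excludes overly short steps. Below is a minimal sketch of this check in Python with NumPy; the function name, parameter names, and the quadratic test function are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def second_wolfe_holds(grad, x, d, alpha, beta2=0.9):
    """Check the second Wolfe (curvature) condition.

    grad  : callable returning the gradient of f at a point
    x     : current iterate
    d     : descent direction, i.e. grad(x) @ d < 0
    alpha : candidate step size
    beta2 : curvature parameter, 0 < beta2 < 1
    """
    # The directional derivative at the candidate point must have
    # increased sufficiently relative to the one at the current point.
    return grad(x + alpha * d) @ d >= beta2 * (grad(x) @ d)

# Illustrative example on f(x) = 0.5 * ||x||^2, where grad f(x) = x.
grad = lambda x: x
x = np.array([2.0, -1.0])
d = -grad(x)  # steepest-descent direction

print(second_wolfe_holds(grad, x, d, alpha=0.5))   # True: step long enough
print(second_wolfe_holds(grad, x, d, alpha=1e-4))  # False: step too short
```

In this example the condition with beta2 = 0.9 accepts exactly the steps with alpha >= 0.1, illustrating how beta_2 trades off between rejecting short steps and leaving the algorithm a wide interval of acceptable step sizes.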
