
Lecture: Descent methods and line search: Second Wolfe condition

Description

This lecture delves into the second Wolfe condition, which rules out steps that are too short along a descent direction. By requiring a sufficient increase in the directional derivative at the new point, the condition gives the algorithm flexibility, accepting a range of step sizes that depends on the parameter beta_2. Through illustrations and examples, the instructor explains how the second Wolfe condition characterizes the progress of the algorithm and governs the acceptance or rejection of candidate steps, ultimately shaping the optimization process.
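The check described above can be sketched in a few lines. This is a minimal illustration, not code from the lecture: the helper name `second_wolfe` and the quadratic test function are assumptions chosen for the example, and the condition implemented is the standard curvature inequality grad f(x + alpha d)^T d >= beta_2 * grad f(x)^T d.

```python
import numpy as np

def second_wolfe(f_grad, x, d, alpha, beta2=0.9):
    """Check the second Wolfe (curvature) condition.

    Rejects steps that are too short by requiring the directional
    derivative at the new point to have increased sufficiently:
        grad f(x + alpha*d) . d  >=  beta2 * grad f(x) . d
    where d is a descent direction, so grad f(x) . d < 0.
    """
    return f_grad(x + alpha * d) @ d >= beta2 * (f_grad(x) @ d)

# Illustrative test function f(x) = x1^2 + 2*x2^2, gradient (2*x1, 4*x2)
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x = np.array([1.0, 1.0])
d = -grad(x)  # steepest-descent direction

print(second_wolfe(grad, x, d, alpha=1e-4))  # tiny step: rejected (False)
print(second_wolfe(grad, x, d, alpha=0.2))   # larger step: accepted (True)
```

A very small step barely changes the gradient, so the directional derivative stays close to its (negative) value at x and the inequality fails; a larger step moves the directional derivative toward zero and the step is accepted.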



Related concepts (15)

Introduction to linear optimization, duality and the simplex algorithm.

Introduction to network optimization and discrete optimization

Introduction to unconstrained nonlinear optimization, Newton’s algorithms and descent methods.

A directional derivative is a concept in multivariable calculus that measures the rate at which a function changes in a particular direction at a given point. The directional derivative of a multivariable differentiable (scalar) function along a given vector v at a given point x intuitively represents the instantaneous rate of change of the function, moving through x with a velocity specified by v. The directional derivative of a scalar function f with respect to a vector v at a point (e.g.
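The definition above can be made concrete with a small numerical example. The function f(x, y) = x^2 y and the point below are illustrative assumptions, not taken from the page: the directional derivative along a unit vector v is the dot product of the gradient with v.

```python
import numpy as np

def directional_derivative(grad, x, v):
    """Directional derivative of f at x along direction v,
    computed as grad f(x) . (v / ||v||)."""
    v = v / np.linalg.norm(v)  # normalise the direction
    return grad(x) @ v

# Illustrative function f(x, y) = x^2 * y, gradient (2*x*y, x^2)
grad = lambda p: np.array([2 * p[0] * p[1], p[0] ** 2])
x = np.array([1.0, 2.0])   # gradient here is (4, 1)
v = np.array([3.0, 4.0])   # normalises to (0.6, 0.8)

print(directional_derivative(grad, x, v))  # 4*0.6 + 1*0.8 = 3.2
```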

In mathematics, the derivative shows the sensitivity of change of a function's output with respect to the input. Derivatives are a fundamental tool of calculus. For example, the derivative of the position of a moving object with respect to time is the object's velocity: this measures how quickly the position of the object changes when time advances. The derivative of a function of a single variable at a chosen input value, when it exists, is the slope of the tangent line to the graph of the function at that point.

In mathematics, the covariant derivative is a way of specifying a derivative along tangent vectors of a manifold. Alternatively, the covariant derivative is a way of introducing and working with a connection on a manifold by means of a differential operator, to be contrasted with the approach given by a principal connection on the frame bundle – see affine connection. In the special case of a manifold isometrically embedded into a higher-dimensional Euclidean space, the covariant derivative can be viewed as the orthogonal projection of the Euclidean directional derivative onto the manifold's tangent space.

In mathematics, the Fréchet derivative is a derivative defined on normed spaces. Named after Maurice Fréchet, it is commonly used to generalize the derivative of a real-valued function of a single real variable to the case of a vector-valued function of multiple real variables, and to define the functional derivative used widely in the calculus of variations. Generally, it extends the idea of the derivative from real-valued functions of one real variable to functions on normed spaces.

On a differentiable manifold, the exterior derivative extends the concept of the differential of a function to differential forms of higher degree. The exterior derivative was first described in its current form by Élie Cartan in 1899. The resulting calculus, known as exterior calculus, allows for a natural, metric-independent generalization of Stokes' theorem, Gauss's theorem, and Green's theorem from vector calculus.

Related lectures (34)

Descent methods and line search: First Wolfe condition (MOOC: Optimization: principles and algorithms - Linear optimization)

Introduces the First Wolfe condition to ensure a proportional decrease in the objective function relative to the step length.

Optimization Techniques (MATH-106(en): Analysis II (English))

Covers optimization techniques and their application in solving real-world problems.

Directional Derivative: Definitions and Examples (MATH-105(b): Advanced analysis II)

Covers the definition of directional derivative and provides illustrative examples.

Implicit Functions (MATH-431: Theory of stochastic calculus)

Covers directional derivatives, implicit functions, and the concept of the maximum.

Objective function, Differentiability, The second order (MOOC: Optimization: principles and algorithms - Linear optimization)

Explores the role of first and second derivatives in function curvature and convexity.