
# Optimization methods (Lecture)

Description

This lecture covers optimization methods, focusing on gradient methods and line search techniques, with a special emphasis on the quadratic case. The instructor explains the concepts step by step, providing examples and discussing convergence criteria.
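The combination of a gradient method with a line search that this lecture covers can be sketched in a few lines. The following is a minimal illustration on a quadratic test problem; the function names, tolerances, and the Armijo backtracking variant are illustrative choices, not taken from the lecture itself:

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds."""
    while f(x + alpha * d) > f(x) + c * alpha * (grad(x) @ d):
        alpha *= rho
    return alpha

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:       # convergence criterion on the gradient
            break
        d = -g                            # steepest-descent direction
        x = x + backtracking_line_search(f, grad, x, d) * d
    return x

# Quadratic test case: f(x) = 1/2 x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = gradient_descent(f, grad, np.zeros(2))
```

In the quadratic case the iterate can be checked against the linear system `A x = b`, which is what makes quadratics the standard testbed for these methods.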

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related concepts (93)

MATH-351: Advanced numerical analysis

The student will learn state-of-the-art algorithms for solving differential equations. The analysis and implementation of these algorithms will be discussed in some detail.

Program optimization

In computer science, program optimization (also code optimization or software optimization) is the process of modifying a software system to make some aspect of it work more efficiently or use fewer resources. In general, a computer program may be optimized so that it executes more rapidly, operates with less memory or other resources, or draws less power. Although the word "optimization" shares the same root as "optimal", the process of optimization rarely produces a truly optimal system.

Quadratic form

In mathematics, a quadratic form is a polynomial with terms all of degree two ("form" is another name for a homogeneous polynomial). For example, 4x² + 2xy - 3y² is a quadratic form in the variables x and y. The coefficients usually belong to a fixed field K, such as the real or complex numbers, and one speaks of a quadratic form over K. Over the real numbers, if the quadratic form equals zero only when all variables are simultaneously zero, then it is a definite quadratic form; otherwise it is an isotropic quadratic form.
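A quadratic form can be written compactly in matrix notation with a symmetric coefficient matrix; a worked illustration (the specific coefficients are an arbitrary example, not taken from this page):

```latex
q(x, y) = 4x^2 + 2xy - 3y^2
        = \begin{pmatrix} x & y \end{pmatrix}
          \begin{pmatrix} 4 & 1 \\ 1 & -3 \end{pmatrix}
          \begin{pmatrix} x \\ y \end{pmatrix},
\qquad
q(\mathbf{x}) = \mathbf{x}^{\mathsf T} A \mathbf{x}, \quad A = A^{\mathsf T}.
```

The matrix form is what connects quadratic forms to optimization: definiteness of A decides whether the associated quadratic function has a unique minimizer.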

Combinatorial optimization

Combinatorial optimization is a subfield of mathematical optimization that consists of finding an optimal object from a finite set of objects, where the set of feasible solutions is discrete or can be reduced to a discrete set. Typical combinatorial optimization problems are the travelling salesman problem (TSP), the minimum spanning tree problem (MST), and the knapsack problem. In many such problems exhaustive search is not tractable, so one must instead resort to specialized algorithms that quickly rule out large parts of the search space, or to approximation algorithms.
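As a contrast to exhaustive search, the knapsack problem mentioned above admits a classic dynamic-programming algorithm; a minimal sketch (the instance data are an arbitrary textbook example, not from this page):

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack by dynamic programming: O(n * capacity) vs O(2^n) brute force."""
    dp = [0] * (capacity + 1)                  # dp[c] = best value using capacity c
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):   # reverse scan: each item used once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

best = knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50)  # -> 220
```

The reverse inner loop is the design choice that enforces the 0/1 constraint; scanning forward would allow an item to be reused, turning this into the unbounded knapsack.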

Quadratic function

In mathematics, a quadratic polynomial is a polynomial of degree two in one or more variables. A quadratic function is the polynomial function defined by a quadratic polynomial. Before the 20th century, the distinction was unclear between a polynomial and its associated polynomial function; so "quadratic polynomial" and "quadratic function" were almost synonymous. This is still the case in many elementary courses, where both terms are often abbreviated as "quadratic".

Stiff equation

In mathematics, a stiff equation is a differential equation for which certain numerical methods for solving the equation are numerically unstable, unless the step size is taken to be extremely small. It has proven difficult to formulate a precise definition of stiffness, but the main idea is that the equation includes some terms that can lead to rapid variation in the solution.
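The instability described above can be demonstrated on the model problem y' = -1000 y: forward Euler diverges unless the step size satisfies the stability bound h < 2/1000. This sketch and its parameter values are illustrative assumptions, not from the text:

```python
def forward_euler(f, y0, h, n_steps):
    """Explicit (forward) Euler: y_{k+1} = y_k + h * f(y_k)."""
    y = y0
    for _ in range(n_steps):
        y = y + h * f(y)
    return y

# Stiff model problem y' = -1000 y; the exact solution exp(-1000 t) decays to 0.
f = lambda y: -1000.0 * y

# Forward Euler multiplies y by (1 - 1000 h) each step: stable only if h < 2/1000.
y_unstable = forward_euler(f, 1.0, h=0.01, n_steps=100)    # factor -9: blows up
y_stable = forward_euler(f, 1.0, h=0.0005, n_steps=100)    # factor 0.5: decays
```

The tiny step needed for stability, far smaller than accuracy alone would require, is exactly what makes implicit methods attractive for stiff problems.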

Related lectures (144)

Newton Method: Convergence and Quadratic Case (MATH-351: Advanced numerical analysis)

Covers the Newton method and its convergence properties near the optimal point.
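A minimal sketch of Newton's method for unconstrained minimization, illustrating the fast convergence near the optimal point that this lecture discusses (the test function, starting point, and tolerances are illustrative assumptions):

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method: solve H(x) p = -grad(x), then take the full step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)   # Newton direction
        x = x + p                          # full step: quadratic convergence near x*
    return x

# Test function with minimizer (1, -2): f(x) = (x0-1)^4 + (x0-1)^2 + (x1+2)^2
grad = lambda x: np.array([4 * (x[0] - 1) ** 3 + 2 * (x[0] - 1), 2 * (x[1] + 2)])
hess = lambda x: np.array([[12 * (x[0] - 1) ** 2 + 2, 0.0], [0.0, 2.0]])
x_opt = newton_minimize(grad, hess, x0=[0.0, 0.0])
```

Note that the full step is only safe near the optimum; far from it, Newton's method is usually combined with a line search or trust region.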

Quasi-Newton Optimization (MATH-351: Advanced numerical analysis)

Covers gradient line search methods and optimization techniques with an emphasis on Wolfe conditions and positive definiteness.
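The Wolfe conditions mentioned here can be expressed as a small step-acceptance test; a minimal sketch (the constants c1 and c2 are conventional textbook defaults, and the toy function is an assumption, not material from the lecture):

```python
import numpy as np

def wolfe_conditions(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for step length alpha along direction d."""
    g0 = grad(x) @ d                                        # slope at x along d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0     # sufficient decrease
    curvature = grad(x + alpha * d) @ d >= c2 * g0          # slope flattened enough
    return bool(armijo and curvature)

# Toy example f(x) = ||x||^2 with the steepest-descent direction.
f = lambda x: float(x @ x)
grad = lambda x: 2 * x
x = np.array([1.0])
d = -grad(x)
accepted = wolfe_conditions(f, grad, x, d, alpha=0.25)   # both conditions hold
rejected = wolfe_conditions(f, grad, x, d, alpha=1.0)    # overshoots: Armijo fails
```

The curvature condition is the reason Wolfe-based line searches pair well with BFGS: it guarantees the update keeps the approximate Hessian positive definite.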

Conjugate Gradient Optimization (MATH-351: Advanced numerical analysis)

Explores Conjugate Gradient optimization, covering quadratic and nonlinear cases, Wolfe conditions, BFGS, CG algorithms, and matrix symmetry.
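For the quadratic case mentioned above, the conjugate gradient iteration can be sketched as follows; a minimal illustration with the standard residual-based update (the test matrix is an arbitrary SPD example, not from the lecture):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Linear CG: minimizes 1/2 x^T A x - b^T x for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual = negative gradient
    d = r.copy()               # first search direction
    for _ in range(len(b)):    # at most n steps in exact arithmetic
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)         # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)   # residual-based beta
        d = r_new + beta * d               # next A-conjugate direction
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_cg = conjugate_gradient(A, b)
```

In the nonlinear case the same skeleton survives, but the exact line search is replaced by one satisfying the Wolfe conditions and beta by a formula such as Fletcher-Reeves.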

Runge-Kutta Methods (MATH-351: Advanced numerical analysis)

Explains the Runge-Kutta methods, particularly the explicit scheme of order 4 (ERK4), and how to optimize parameters for accuracy.
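The classical explicit scheme of order 4 can be sketched directly from its four stages; a minimal illustration (the test problem y' = y is an arbitrary sanity check, not taken from the lecture):

```python
import math

def rk4_step(f, t, y, h):
    """One step of the classical explicit Runge-Kutta scheme of order 4 (ERK4)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t_end, n_steps):
    h = (t_end - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# Check on y' = y, y(0) = 1: the computed y(1) should be close to e.
y1 = integrate(lambda t, y: y, 0.0, 1.0, 1.0, 100)
```

Halving h should reduce the global error by roughly a factor of 16, which is the practical signature of a fourth-order method.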

Optimization Methods (MATH-351: Advanced numerical analysis)

Covers optimization methods without constraints, including gradient and line search in the quadratic case.