Lecture

Descent methods and line search: Golden section

Description

This lecture covers the main ideas behind descent methods and line search, focusing on interval generation, symmetric reduction, and the computation of the golden section. The instructor explains the underlying assumptions, the treatment of one-dimensional functions, and how the next interval is generated in each of the possible cases. The lecture also examines the symmetric-reduction technique and the recycling of function evaluations, and details the calculation of the golden-section parameter ρ, emphasizing the iterative process and the mathematical operations involved.
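The ideas described above can be sketched in code. This is a minimal illustration of golden-section line search, assuming a unimodal function f on an interval [a, b]; the function and parameter names are illustrative, not taken from the lecture. The reduction factor ρ = (3 − √5)/2 is chosen so that, after the interval shrinks, one of the two interior points from the previous step lands exactly where a new interior point is needed, which lets each iteration recycle one function evaluation:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    # rho = (3 - sqrt(5)) / 2 ~= 0.381966 is the golden-section parameter:
    # it guarantees one interior point can be reused at every iteration.
    rho = (3 - math.sqrt(5)) / 2
    x1 = a + rho * (b - a)
    x2 = b - rho * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Case 1: the minimum lies in [a, x2]; x1 is recycled as the new x2.
            b, x2, f2 = x2, x1, f1
            x1 = a + rho * (b - a)
            f1 = f(x1)
        else:
            # Case 2: the minimum lies in [x1, b]; x2 is recycled as the new x1.
            a, x1, f1 = x1, x2, f2
            x2 = b - rho * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: minimize f(x) = (x - 2)^2 on [0, 5]; the minimizer is x = 2.
x_star = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the interval by the constant factor 1 − ρ ≈ 0.618 at the cost of a single new function evaluation, which is the point of the symmetric-reduction argument in the lecture.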

In MOOCs (6)
Optimization: principles and algorithms - Linear optimization
Introduction to linear optimization, duality and the simplex algorithm.
Optimization: principles and algorithms - Network and discrete optimization
Introduction to network optimization and discrete optimization
Optimization: principles and algorithms - Unconstrained nonlinear optimization
Introduction to unconstrained nonlinear optimization, Newton’s algorithms and descent methods.
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related lectures (34)
Analysis of Trust Regions with Cauchy Steps
Explores the analysis of trust regions with Cauchy steps in optimization.
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Optimisation in Energy Systems
Explores optimization in energy system modeling, covering decision variables, objective functions, and different strategies with their pros and cons.
Optimization without Constraints: Gradient Method
Covers optimization without constraints using the gradient method to find the function's minimum.
Gradient Descent: Optimization and Constraints
Discusses gradient descent for optimization with equality constraints and iterative convergence criteria.
