Gradient method

Summary
In optimization, a gradient method is an algorithm to solve problems of the form :\min_{x\in\mathbb R^n}; f(x) with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are the gradient descent and the conjugate gradient. See also
  • Gradient descent
  • Stochastic gradient descent
  • Coordinate descent
  • Frank–Wolfe algorithm
  • Landweber iteration
  • Random coordinate descent
  • Conjugate gradient method
  • Derivation of the conjugate gradient method
  • Nonlinear conjugate gradient method
  • Biconjugate gradient method
  • Biconjugate gradient stabilized method
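The basic idea described above, stepping against the gradient of the objective at the current point, can be sketched as follows. This is a minimal illustrative example (the objective, step size, and iteration count are chosen for demonstration and are not from the page above):

```python
# Minimal gradient-descent sketch: minimize
#   f(x) = (x0 - 3)^2 + (x1 + 1)^2,
# whose gradient is grad f(x) = [2*(x0 - 3), 2*(x1 + 1)].
# Hypothetical example objective; any differentiable f works the same way.

def grad(x):
    """Analytic gradient of the example objective."""
    return [2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)]

def gradient_descent(x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x <- x - step * grad f(x)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        # Move against the gradient, the direction of steepest descent.
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

x_star = gradient_descent([0.0, 0.0])
# For this convex quadratic, the iterates converge toward the minimizer (3, -1).
```

In practice the fixed step size would be replaced by a line search or an adaptive schedule, and methods such as conjugate gradient modify the search direction rather than following the raw gradient at every step.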
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.