Gradient method


In optimization, a gradient method is an algorithm to solve problems of the form

\min_{x \in \mathbb{R}^n} f(x)

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
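
As an illustrative sketch (not part of the article), the following Python code implements the simplest gradient method, gradient descent with a fixed step size, applied to the quadratic f(x) = x₁² + x₂². The step size, tolerance, and test function are assumed values chosen for this example.

    def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
        # Iterate x_{k+1} = x_k - step * grad(x_k): the search direction
        # is the negative gradient of f at the current point.
        x = list(x0)
        for _ in range(max_iter):
            g = grad(x)
            # Stop once the gradient norm is small enough.
            if sum(gi * gi for gi in g) ** 0.5 < tol:
                break
            x = [xi - step * gi for xi, gi in zip(x, g)]
        return x

    # Minimize f(x) = x1^2 + x2^2, whose gradient is (2*x1, 2*x2).
    x_min = gradient_descent(lambda x: [2.0 * xi for xi in x], [3.0, -4.0])
    print(x_min)  # converges toward the minimizer (0, 0)

With step = 0.1 each iterate of this example contracts toward the origin by a factor of 0.8, so convergence is linear; conjugate gradient methods improve on plain gradient descent by combining the current gradient with previous search directions.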

See also

  • Gradient descent
  • Conjugate gradient method
