Gradient method

From Wikipedia, the free encyclopedia

In optimization, a gradient method is an algorithm to solve problems of the form

\min_{x\in\mathbb R^n}\; f(x)

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
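As a minimal sketch of the idea, the following gradient descent routine repeatedly steps opposite the gradient at the current point; the function names, step size, and the example quadratic objective are illustrative choices, not part of the article:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Minimize f over R^n by stepping along the negative gradient of f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is nearly zero
            break
        x = x - lr * g  # search direction is -grad f(x)
    return x

# Example: minimize f(x) = (x_1 - 3)^2 + (x_2 + 1)^2,
# whose gradient is 2 * (x - [3, -1]); the minimizer is [3, -1].
minimizer = gradient_descent(lambda x: 2 * (x - np.array([3.0, -1.0])),
                             x0=[0.0, 0.0])
```

With a fixed step size, convergence is only guaranteed under conditions on f (e.g. a Lipschitz-continuous gradient); conjugate gradient methods instead choose successive search directions that are conjugate with respect to the problem.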

See also