In optimization, a gradient method is an algorithm to solve problems of the form

:\min_{x \in \mathbb{R}^n} f(x)
with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
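To illustrate the idea, the following is a minimal sketch of gradient descent in Python: at each iteration the search direction is the negative gradient, followed by a step of fixed length. The function name gradient_descent, the fixed step size, and the quadratic test problem are illustrative assumptions, not part of any particular library; practical implementations typically add a line search or other step-size rule.

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    # Iteratively move against the gradient of f until it (nearly) vanishes.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                  # search direction is -grad f(x)
        if np.linalg.norm(g) < tol:  # stop at an (approximate) stationary point
            break
        x = x - step * g             # fixed step size; a line search is a common refinement
    return x

# Example: minimize f(x) = ||x - a||^2, whose gradient is 2(x - a).
a = np.array([1.0, -2.0])
print(gradient_descent(lambda x: 2.0 * (x - a), x0=np.zeros(2)))  # converges to [1, -2]
</syntaxhighlight>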
See also
* Gradient descent
* Stochastic gradient descent
* Coordinate descent
* Frank–Wolfe algorithm
* Landweber iteration
* Random coordinate descent
* Conjugate gradient method
* Derivation of the conjugate gradient method
* Nonlinear conjugate gradient method
* Biconjugate gradient method
* Biconjugate gradient stabilized method