In optimization, a descent direction is a vector $\mathbf{p}\in\mathbb{R}^n$ that, in the sense below, moves us closer towards a local minimum $\mathbf{x}^*$ of our objective function $f:\mathbb{R}^n\to\mathbb{R}$.
Suppose we are computing $\mathbf{x}^*$ by an iterative method, such as line search. We define a descent direction $\mathbf{p}_k\in\mathbb{R}^n$ at the $k$th iterate $\mathbf{x}_k$ to be any vector $\mathbf{p}_k$ such that $\langle\mathbf{p}_k,\nabla f(\mathbf{x}_k)\rangle < 0$, where $\langle\cdot,\cdot\rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
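The defining condition is straightforward to check numerically. The sketch below assumes a hypothetical quadratic objective $f(\mathbf{x}) = x_1^2 + 4x_2^2$ (chosen for illustration, not from this article) and tests $\langle\mathbf{p},\nabla f(\mathbf{x}_k)\rangle < 0$ directly:

```python
import numpy as np

def grad_f(x):
    # Gradient of the hypothetical quadratic f(x) = x1^2 + 4*x2^2.
    return np.array([2.0 * x[0], 8.0 * x[1]])

def is_descent_direction(p, x):
    """p is a descent direction at x when <p, grad f(x)> < 0."""
    return np.dot(p, grad_f(x)) < 0

x_k = np.array([1.0, 1.0])
print(is_descent_direction(np.array([-1.0, 0.0]), x_k))  # True: opposes the gradient
print(is_descent_direction(np.array([1.0, 1.0]), x_k))   # False: points uphill
```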
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle < 0$.
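This identity can be spot-checked numerically; the random vector below is a stand-in for an arbitrary non-zero gradient:

```python
import numpy as np

# Hypothetical stand-in for a non-zero gradient vector.
rng = np.random.default_rng(0)
g = rng.normal(size=5)

# <-g, g> = -||g||^2, strictly negative whenever g is non-zero,
# so -g always satisfies the descent condition.
value = np.dot(-g, g)
assert value < 0
```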
Numerous methods exist to compute descent directions, each with differing merits. For example, one could use gradient descent or the conjugate gradient method.
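As a sketch of the simplest of these choices, gradient descent repeatedly steps along the descent direction $-\nabla f(\mathbf{x}_k)$. The quadratic objective and step size below are hypothetical illustrations, not prescribed by this article:

```python
import numpy as np

def grad_f(x):
    # Gradient of the hypothetical quadratic f(x) = x1^2 + 4*x2^2.
    return np.array([2.0 * x[0], 8.0 * x[1]])

# Gradient descent: each iterate steps along the descent direction -grad f(x).
x = np.array([1.0, 1.0])
step = 0.1  # fixed step size, small enough for this objective
for _ in range(100):
    x = x - step * grad_f(x)
print(x)  # converges towards the minimizer at the origin
```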
More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P\nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$, since $\langle -P\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k)\rangle = -\nabla f(\mathbf{x}_k)^\top P\,\nabla f(\mathbf{x}_k) < 0$. This generality is used in preconditioned gradient descent methods.
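The preconditioned direction can be verified the same way. The diagonal matrix $P$ below is a hypothetical preconditioner for the same illustrative quadratic:

```python
import numpy as np

def grad_f(x):
    # Gradient of the hypothetical quadratic f(x) = x1^2 + 4*x2^2.
    return np.array([2.0 * x[0], 8.0 * x[1]])

x_k = np.array([1.0, 1.0])
g = grad_f(x_k)

# Any positive definite P yields a descent direction p = -P @ g,
# since <-P g, g> = -g^T P g < 0. Here P is a hypothetical
# diagonal preconditioner (the inverse of the Hessian diag(2, 8)).
P = np.diag([0.5, 0.125])
p_k = -P @ g
assert np.dot(p_k, g) < 0
print(p_k)  # -> [-1. -1.]
```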
See also
* Directional derivative