Subgradient methods are convex optimization methods which use subderivatives. Originally developed by Naum Z. Shor and others in the 1960s and 1970s, subgradient methods are convergent when applied even to a non-differentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of gradient descent.
Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions. However, Newton's method fails to converge on problems that have non-differentiable kinks.
In recent years, some interior-point methods have been suggested for convex minimization problems, but subgradient projection methods and related bundle methods of descent remain competitive. For convex minimization problems with a very large number of dimensions, subgradient-projection methods are suitable, because they require little storage.
Subgradient projection methods are often applied to large-scale problems with decomposition techniques. Such decomposition methods often allow a simple distributed method for a problem.
Classical subgradient rules
Let $f:\mathbb{R}^n \to \mathbb{R}$ be a convex function with domain $\mathbb{R}^n$. A classical subgradient method iterates
:$x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)},$
where $g^{(k)}$ denotes ''any'' subgradient of $f$ at $x^{(k)}$, and $x^{(k)}$ is the $k$-th iterate of $x$. If $f$ is differentiable, then its only subgradient is the gradient vector $\nabla f$ itself. It may happen that $-g^{(k)}$ is not a descent direction for $f$ at $x^{(k)}$. We therefore maintain a list $f_{\rm best}$ that keeps track of the lowest objective function value found so far, i.e.
:$f_{\rm best}^{(k)} = \min\left\{ f_{\rm best}^{(k-1)}, f\left(x^{(k)}\right) \right\}.$
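As a minimal sketch of the iteration above (the objective function and all constants here are assumed for illustration and are not part of the original article), written in Python: subgrad returns one valid subgradient, and f_best records the lowest value seen so far.

import numpy as np

def f(x):
    # example non-differentiable convex objective: f(x) = |x1| + 2|x2|
    return abs(x[0]) + 2 * abs(x[1])

def subgrad(x):
    # one valid subgradient of f at x; np.sign(0) = 0 is an admissible
    # element of the subdifferential [-1, 1] of |t| at t = 0
    return np.array([np.sign(x[0]), 2 * np.sign(x[1])])

x = np.array([1.3, 0.7])            # starting point x^(1)
f_best = f(x)                       # lowest objective value found so far
for k in range(1, 1001):
    g = subgrad(x)                  # any subgradient g^(k) at x^(k)
    alpha = 1.0 / k                 # a diminishing step size (see the rules below)
    x = x - alpha * g               # x^(k+1) = x^(k) - alpha_k g^(k)
    f_best = min(f_best, f(x))      # keep the best value found so far

print(f_best)                       # approaches the minimum value 0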
Step size rules
Many different types of step-size rules are used by subgradient methods. This article notes five classical step-size rules for which convergence
proofs are known:
*Constant step size, $\alpha_k = \alpha.$
*Constant step length, $\alpha_k = \gamma/\lVert g^{(k)} \rVert_2,$ which gives $\lVert x^{(k+1)} - x^{(k)} \rVert_2 = \gamma.$
*Square summable but not summable step size, i.e. any step sizes satisfying $\alpha_k \geq 0,\quad \sum_{k=1}^\infty \alpha_k^2 < \infty,\quad \sum_{k=1}^\infty \alpha_k = \infty.$
*Nonsummable diminishing, i.e. any step sizes satisfying $\alpha_k \geq 0,\quad \lim_{k\to\infty} \alpha_k = 0,\quad \sum_{k=1}^\infty \alpha_k = \infty.$
*Nonsummable diminishing step lengths, i.e. $\alpha_k = \gamma_k/\lVert g^{(k)} \rVert_2,$ where $\gamma_k \geq 0,\quad \lim_{k\to\infty} \gamma_k = 0,\quad \sum_{k=1}^\infty \gamma_k = \infty.$
For all five rules, the step-sizes are determined "off-line", before the method is iterated; the step-sizes do not depend on preceding iterations. This "off-line" property of subgradient methods differs from the "on-line" step-size rules used for descent methods for differentiable functions: many methods for minimizing differentiable functions satisfy Wolfe's sufficient conditions for convergence, where step-sizes typically depend on the current point and the current search-direction. An extensive discussion of step-size rules for subgradient methods, including incremental versions, is given in the books by Bertsekas and by Bertsekas, Nedic, and Ozdaglar.
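To make the five rules concrete, here is one possible way to express them as small Python helpers (an illustrative sketch; the function names and constants are assumptions, not part of the article). The sequences $\alpha$ or $\gamma_k$ are fixed in advance; the step-length rules merely scale by the norm of the current subgradient.

import numpy as np

# Illustrative helpers for the five classical rules; k is the iteration
# counter (1-indexed) and g_norm is the Euclidean norm of the current
# subgradient (only used by the step-length rules).

def constant_step_size(k, g_norm, alpha=0.01):
    return alpha                           # alpha_k = alpha

def constant_step_length(k, g_norm, gamma=0.01):
    return gamma / g_norm                  # gives ||x^(k+1) - x^(k)|| = gamma

def square_summable_not_summable(k, g_norm, a=1.0):
    return a / k                           # sum of alpha_k^2 finite, sum of alpha_k infinite

def nonsummable_diminishing(k, g_norm, a=1.0):
    return a / np.sqrt(k)                  # alpha_k -> 0, sum of alpha_k infinite

def nonsummable_diminishing_length(k, g_norm, a=1.0):
    return (a / np.sqrt(k)) / g_norm       # gamma_k = a / sqrt(k) in the last rule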
Convergence results
For constant step-length and scaled subgradients having Euclidean norm equal to one, the subgradient method converges to an arbitrarily close approximation to the minimum value, that is
:$\lim_{k\to\infty} f_{\rm best}^{(k)} - f^* < \epsilon,$
by a result of Shor.
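For orientation, a standard suboptimality bound from the analysis of these methods (stated here under the assumptions that $\lVert g^{(k)} \rVert_2 \leq G$ for all $k$ and that $\lVert x^{(1)} - x^* \rVert_2 \leq R$; this particular statement is not taken from the original text) is
:$f_{\rm best}^{(k)} - f^* \;\leq\; \frac{R^2 + G^2 \sum_{i=1}^k \alpha_i^2}{2 \sum_{i=1}^k \alpha_i},$
from which the convergence behaviour of each of the five step-size rules above can be read off.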
These classical subgradient methods have poor performance and are no longer recommended for general use.
However, they are still used widely in specialized applications because they are simple and they can be easily adapted to take advantage of the special structure of the problem at hand.
Subgradient-projection and bundle methods
During the 1970s,
Claude Lemaréchal and Phil Wolfe proposed "bundle methods" of descent for problems of convex minimization. The meaning of the term "bundle methods" has changed significantly since that time. Modern versions and full convergence analysis were provided by Kiwiel.
Contemporary bundle methods often use "level control" rules for choosing step-sizes, developing techniques from the "subgradient-projection" method of Boris T. Polyak (1969). However, there are problems on which bundle methods offer little advantage over subgradient-projection methods.
Constrained optimization
Projected subgradient
One extension of the subgradient method is the projected subgradient method, which solves the constrained optimization problem
:minimize $f(x)$
subject to
:$x \in \mathcal{C},$
where $\mathcal{C}$ is a convex set. The projected subgradient method uses the iteration
:$x^{(k+1)} = P\left(x^{(k)} - \alpha_k g^{(k)}\right),$
where $P$ is projection on $\mathcal{C}$ and $g^{(k)}$ is any subgradient of $f$ at $x^{(k)}.$
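As a sketch of this iteration (with an assumed example problem, not from the article), take $\mathcal{C}$ to be the nonnegative orthant, whose Euclidean projection is a coordinate-wise clamp at zero.

import numpy as np

target = np.array([-1.0, 2.0])

def f(x):
    # example convex objective: f(x) = |x1 + 1| + |x2 - 2|
    return np.sum(np.abs(x - target))

def subgrad(x):
    # one valid subgradient of f at x
    return np.sign(x - target)

def project_C(x):
    # Euclidean projection onto C = {x : x >= 0}
    return np.maximum(x, 0.0)

x = np.array([2.3, 3.7])
for k in range(1, 1001):
    g = subgrad(x)
    alpha = 1.0 / k                        # diminishing step size
    x = project_C(x - alpha * g)           # x^(k+1) = P(x^(k) - alpha_k g^(k))

print(x)                                   # approaches the constrained minimizer (0, 2)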
General constraints
The subgradient method can be extended to solve the inequality constrained problem
:minimize $f_0(x)$
subject to
:$f_i(x) \leq 0,\quad i = 1,\dots,m,$
where $f_i$ are convex. The algorithm takes the same form as the unconstrained case
:$x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)},$
where $\alpha_k > 0$ is a step size, and $g^{(k)}$ is a subgradient of the objective or one of the constraint functions at $x^{(k)}.$ Take
:$g^{(k)} = \begin{cases} \partial f_0(x) & \text{if } f_i(x) \leq 0 \text{ for all } i = 1,\dots,m \\ \partial f_j(x) & \text{for some } j \text{ such that } f_j(x) > 0, \end{cases}$
where $\partial f$ denotes the subdifferential of $f$. If the current point is feasible, the algorithm uses an objective subgradient; if the current point is infeasible, the algorithm chooses a subgradient of any violated constraint.
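A minimal sketch of this constrained rule (again with an assumed example problem, not taken from the article) alternates between objective and constraint subgradients depending on feasibility.

import numpy as np

def f0(x):
    return np.sum(x ** 2)                  # objective: ||x||^2

def grad_f0(x):
    return 2.0 * x                         # differentiable, so the gradient is its only subgradient

def f1(x):
    return 1.0 - np.sum(x)                 # constraint: f1(x) = 1 - (x1 + x2) <= 0

def grad_f1(x):
    return -np.ones_like(x)

x = np.array([2.0, 2.0])
for k in range(1, 2001):
    if f1(x) <= 0:
        g = grad_f0(x)                     # feasible: use an objective subgradient
    else:
        g = grad_f1(x)                     # infeasible: use a subgradient of the violated constraint
    alpha = 1.0 / k                        # diminishing step size
    x = x - alpha * g

print(x)                                   # approaches the solution (0.5, 0.5)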
Further reading
*Bertsekas, Dimitri P. (1999). ''Nonlinear Programming''. Belmont, MA: Athena Scientific.
*Bertsekas, Dimitri P.; Nedic, Angelia; Ozdaglar, Asuman (2003). ''Convex Analysis and Optimization''. Belmont, MA: Athena Scientific.
*Shor, Naum Z. (1985). ''Minimization Methods for Non-differentiable Functions''. Berlin: Springer-Verlag.
*Kiwiel, Krzysztof C. (1985). ''Methods of Descent for Nondifferentiable Optimization''. Berlin: Springer-Verlag.
*Polyak, Boris T. (1987). ''Introduction to Optimization''. New York: Optimization Software.
External links
*EE364A and EE364B, Stanford's convex optimization course sequence.