Proximal gradient methods are a generalized form of projection used to solve non-differentiable convex optimization problems.
Many interesting problems can be formulated as convex optimization problems of the form

\min_{x \in \mathbb{R}^N} \sum_{i=1}^n f_i(x)

where f_i : \mathbb{R}^N \to \mathbb{R}, i = 1, \dots, n, are possibly non-differentiable convex functions. The lack of differentiability rules out conventional smooth optimization techniques like the steepest descent method and the conjugate gradient method, but proximal gradient methods can be used instead.
Proximal gradient methods start with a splitting step, in which the functions f_1, \dots, f_n are used individually so as to yield an easily implementable algorithm. They are called proximal because each non-differentiable function among f_1, \dots, f_n is involved via its proximity operator. The iterative shrinkage-thresholding algorithm, projected Landweber, projected gradient, alternating projections, alternating-direction method of multipliers, and alternating split Bregman are special instances of proximal algorithms.
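For reference (a standard definition, supplied here because the extracted text does not spell it out): the proximity operator of a convex function f with parameter \gamma > 0 is

\operatorname{prox}_{\gamma f}(x) = \arg\min_{u} \left( f(u) + \frac{1}{2\gamma} \|u - x\|_2^2 \right),

and for the common splitting into a smooth term g and a non-smooth term f, the basic proximal gradient iteration is

x_{k+1} = \operatorname{prox}_{\gamma f}\bigl(x_k - \gamma \nabla g(x_k)\bigr).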
Details of proximal methods are discussed in Combettes and Pesquet (2011).
For the theory of proximal gradient methods from the perspective of, and with applications to, statistical learning theory, see proximal gradient methods for learning.
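As a concrete illustration, here is a minimal sketch (not from the article) of the proximal gradient method applied to the lasso problem \min_x \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1, where the \ell_1 term is handled through its proximity operator, elementwise soft-thresholding; this recovers the iterative shrinkage-thresholding algorithm (ISTA) mentioned above. The data A, b, and the regularization weight are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient step on the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

# Illustrative data (assumed, not from the article)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0                           # sparse ground truth
b = A @ x_true
print(ista(A, b, lam=0.1)[:10])            # leading entries of the recovered x
```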
Projection onto convex sets (POCS)
One of the widely used convex optimization algorithms is projection onto convex sets (POCS). This algorithm is employed to recover or synthesize a signal that simultaneously satisfies several convex constraints. Let f_i be the indicator function of a non-empty closed convex set C_i modeling a constraint. This reduces to a convex feasibility problem, which requires us to find a solution lying in the intersection of all the convex sets C_i. In the POCS method each set C_i is incorporated via its projection operator P_{C_i}, so in each iteration x is updated as

x_{k+1} = P_{C_1} P_{C_2} \cdots P_{C_n} x_k.
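A minimal sketch of this update for two assumed example sets (a Euclidean ball and a halfspace, chosen here purely for illustration; not from the article):

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection onto the ball {u : ||u - center|| <= radius}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, beta):
    """Euclidean projection onto the halfspace {u : <a, u> <= beta}."""
    violation = a @ x - beta
    return x if violation <= 0 else x - violation * a / (a @ a)

# Illustrative constraints (assumptions for this sketch):
# C1 = unit ball at the origin, C2 = halfspace x + y <= 0.5
center, radius = np.zeros(2), 1.0
a, beta = np.array([1.0, 1.0]), 0.5

x = np.array([3.0, 3.0])
for _ in range(100):                       # x_{k+1} = P_{C1}(P_{C2}(x_k))
    x = project_ball(project_halfspace(x, a, beta), center, radius)
print(x)                                   # a point in the intersection C1 ∩ C2
```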
However, beyond such problems, projection operators are not appropriate and more general operators are required to tackle them. Among the various generalizations of the notion of a convex projection operator that exist, proximity operators are best suited for this purpose.
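One way to see the connection (a standard fact): when f = \iota_C is the indicator function of a closed convex set C, its proximity operator is exactly the projection onto C,

\operatorname{prox}_{\iota_C}(x) = \arg\min_{u \in C} \tfrac{1}{2}\|u - x\|_2^2 = P_C(x),

so proximity operators strictly generalize convex projections.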
Examples
Special instances of proximal gradient methods are
* Projected Landweber
* Alternating projection
* Alternating-direction method of multipliers
See also
* Proximal operator
* Proximal gradient methods for learning
* Frank–Wolfe algorithm
References
* Combettes, Patrick L.; Pesquet, Jean-Christophe (2011). ''Fixed-Point Algorithms for Inverse Problems in Science and Engineering''. Vol. 49. pp. 185–212.
External links
* Stephen Boyd and Lieven Vandenberghe, ''Convex Optimization'' (book)
* EE364a: Convex Optimization I and EE364b: Convex Optimization II, Stanford course homepages
* EE227A: Lieven Vandenberghe notes, Lecture 18
* ProximalOperators.jl: a Julia package implementing proximity operators.
* ProximalAlgorithms.jl: a Julia package implementing algorithms based on the proximal operator, including the proximal gradient method.
* Proximity Operator repository: a collection of proximity operators implemented in MATLAB and Python.