Proximal Gradient Methods
Proximal gradient methods are a generalized form of projection used to solve non-differentiable convex optimization problems. Many interesting problems can be formulated as convex optimization problems of the form \min_{x \in \mathbb{R}^N} \sum_{i=1}^n f_i(x), where f_i: \mathbb{R}^N \rightarrow \mathbb{R},\ i = 1, \dots, n are possibly non-differentiable convex functions. The lack of differentiability rules out conventional smooth optimization techniques like the steepest descent method and the conjugate gradient method, but proximal gradient methods can be used instead. Proximal gradient methods start with a splitting step, in which the functions f_1, \dots, f_n are used individually so as to yield an easily implementable algorithm. They are called proximal because each non-differentiable function among f_1, \dots, f_n is involved via its proximity operator. The iterative shrinkage-thresholding algorithm, projected Landweber, projected gradient, alternating projections, the alternating-direction method of multipliers, and alternating split Bregman are special instances of proximal algorithms. Details of proximal methods are discussed in Combettes and Pesquet (2011). For the theory of proximal gradient methods from the perspective of, and with applications to, statistical learning theory, see proximal gradient methods for learning.
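As a concrete illustration, a common special case takes n = 2 with a smooth least-squares term f_1(x) = \tfrac{1}{2}\|Ax - b\|_2^2 and a non-differentiable \ell_1 penalty f_2(x) = \lambda \|x\|_1. The proximity operator of a convex function f, \operatorname{prox}_{\gamma f}(v) = \arg\min_x \left( f(x) + \tfrac{1}{2\gamma}\|x - v\|_2^2 \right), reduces to soft-thresholding for f_2, which gives the iterative shrinkage-thresholding algorithm (ISTA). The following is a minimal sketch in Python with NumPy; the problem data A and b, the weight lam, the step size, and the iteration count are illustrative assumptions for this example, not details from the article.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    f_1(x) = 0.5*||Ax - b||^2 is the smooth part handled by a gradient step;
    f_2(x) = lam*||x||_1 is the non-smooth part handled by its proximity operator.
    """
    x = np.zeros(A.shape[1])
    # Fixed step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f_1.
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                          # gradient step on f_1
        x = soft_threshold(x - step * grad, step * lam)   # prox step on f_2
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

The step size 1/L, with L the Lipschitz constant of the gradient of the smooth term, is a standard choice that guarantees convergence of this iteration.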


Projection onto convex sets (POCS)

One of the widely used convex optimization algorithms is projection onto convex sets (POCS). This algorithm is used to recover or synthesize a signal that simultaneously satisfies several convex constraints. Let f_i be the indicator function of a non-empty closed convex set C_i modeling a constraint. This reduces the problem to a convex feasibility problem, which requires us to find a solution lying in the intersection of all the convex sets C_i. In the POCS method each set C_i is incorporated via its projection operator P_{C_i}, so in each iteration x is updated as x_{k+1} = P_{C_1} P_{C_2} \cdots P_{C_n} x_k. However, beyond such problems projection operators are not appropriate, and more general operators are required to tackle them. Among the various generalizations of the notion of a convex projection operator that exist, proximity operators are best suited for other purposes.
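A minimal sketch of this iteration in Python with NumPy is given below, using two simple constraint sets chosen purely for illustration (a box and a hyperplane); the specific sets, starting point, and iteration count are assumptions made for the example rather than details from this article.

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    """Projection onto the box C1 = {x : lo <= x_i <= hi}."""
    return np.clip(x, lo, hi)

def project_hyperplane(x, a, c):
    """Projection onto the hyperplane C2 = {x : a.x = c}."""
    return x - (a @ x - c) / (a @ a) * a

def pocs(x0, a, c, num_iters=200):
    """POCS: apply the projections onto C1 and C2 in turn, each iteration."""
    x = x0.copy()
    for _ in range(num_iters):
        x = project_box(project_hyperplane(x, a, c))   # x_{k+1} = P_{C1} P_{C2} x_k
    return x

if __name__ == "__main__":
    a = np.array([1.0, 2.0, 3.0])
    c = 3.0
    x = pocs(np.array([5.0, -4.0, 2.0]), a, c)
    print(x, "a.x =", a @ x)  # x should (approximately) lie in both sets
```

With only two sets this is the classical alternating projection method; when the intersection is non-empty, the iterates converge to a point in it.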


Examples

Special instances of proximal gradient methods are:
* Projected Landweber
* Alternating projection
* Alternating-direction method of multipliers


See also

* Proximal operator
* Proximal gradient methods for learning
* Frank–Wolfe algorithm


References

* Combettes, Patrick L.; Pesquet, Jean-Christophe (2011). ''Fixed-Point Algorithms for Inverse Problems in Science and Engineering''. Vol. 49. pp. 185–212.


External links

* Stephen Boyd and Lieven Vandenberghe, ''Convex optimization'' (book)
* EE364a: Convex Optimization I and EE364b: Convex Optimization II, Stanford course homepages
* EE227A: Lieven Vandenberghe notes, Lecture 18
* ProximalOperators.jl, a Julia package implementing proximal operators
* ProximalAlgorithms.jl, a Julia package implementing algorithms based on the proximal operator, including the proximal gradient method
* Proximity Operator repository, a collection of proximity operators implemented in MATLAB and Python