In mathematics, '''Moreau's theorem''' is a result in convex analysis named after the French mathematician Jean-Jacques Moreau. It shows that sufficiently well-behaved convex functionals on Hilbert spaces are differentiable and that the derivative is well approximated by the so-called Yosida approximation, which is defined in terms of the resolvent operator.
==Statement of the theorem==
Let ''H'' be a Hilbert space and let ''φ'' : ''H'' → '''R''' ∪ {+∞} be a proper, convex and lower semi-continuous extended real-valued functional on ''H''. Let ''A'' stand for ∂''φ'', the subderivative of ''φ''; for ''α'' > 0 let ''J''<sub>''α''</sub> denote the resolvent:
:<math>J_{\alpha} = ( \mathrm{id} + \alpha A )^{-1},</math>
and let ''A''<sub>''α''</sub> denote the Yosida approximation to ''A'':
:<math>A_{\alpha} = \frac{1}{\alpha} \left( \mathrm{id} - J_{\alpha} \right).</math>
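For instance, taking ''H'' = '''R''' and ''φ''(''x'') = ''x''<sup>2</sup>/2 (a purely illustrative choice, not part of the statement), ''A'' = ∂''φ'' is the identity and

:<math>J_{\alpha} x = \frac{x}{1 + \alpha}, \qquad A_{\alpha} x = \frac{1}{\alpha} \left( x - \frac{x}{1 + \alpha} \right) = \frac{x}{1 + \alpha},</math>

so that ''A''<sub>''α''</sub> converges pointwise to ''A'' as ''α'' → 0.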
For each ''α'' > 0 and ''x'' ∈ ''H'', let
:<math>\varphi_{\alpha} (x) = \inf_{y \in H} \left( \frac{1}{2 \alpha} \| y - x \|^{2} + \varphi (y) \right).</math>
Then
:<math>\varphi_{\alpha} (x) = \frac{\alpha}{2} \| A_{\alpha} x \|^{2} + \varphi ( J_{\alpha} (x) ),</math>
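In the quadratic example above, the infimum is attained at ''y'' = ''J''<sub>''α''</sub>(''x''), and a direct computation confirms the identity:

:<math>\varphi_{\alpha} (x) = \frac{1}{2 \alpha} \left( \frac{\alpha x}{1 + \alpha} \right)^{2} + \frac{1}{2} \left( \frac{x}{1 + \alpha} \right)^{2} = \frac{x^{2}}{2 (1 + \alpha)} = \frac{\alpha}{2} \left( \frac{x}{1 + \alpha} \right)^{2} + \varphi \left( J_{\alpha} (x) \right).</math>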
and ''φ''<sub>''α''</sub> is convex and Fréchet differentiable with derivative d''φ''<sub>''α''</sub> = ''A''<sub>''α''</sub>. Also, for each ''x'' ∈ ''H'' (pointwise), ''φ''<sub>''α''</sub>(''x'') converges upwards to ''φ''(''x'') as ''α'' → 0.
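The theorem can also be checked numerically. The following sketch (an illustration of ours, not taken from the reference; only NumPy is assumed, and all helper names are hypothetical) uses ''H'' = '''R''' and ''φ''(''x'') = |''x''|, for which the resolvent ''J''<sub>''α''</sub> is the soft-thresholding map and ''φ''<sub>''α''</sub> is the Huber function. It verifies the identity, the derivative formula and the upward pointwise convergence:

<syntaxhighlight lang="python">
import numpy as np

# Illustrative check of Moreau's theorem on H = R with phi(x) = |x|.
# All helper names below are our own, chosen for this sketch.

def phi(y):
    return np.abs(y)

def envelope(x, alpha, grid):
    # phi_alpha(x) = inf_y (1/(2*alpha)) * (y - x)^2 + phi(y),
    # approximated by minimising over a fine grid of y values.
    return np.min((grid - x) ** 2 / (2 * alpha) + phi(grid))

def resolvent(x, alpha):
    # J_alpha = (id + alpha*A)^(-1); for phi = |.| this is soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

def yosida(x, alpha):
    # A_alpha = (1/alpha) * (id - J_alpha).
    return (x - resolvent(x, alpha)) / alpha

grid = np.linspace(-5.0, 5.0, 200001)
x, alpha = 0.7, 0.25

# Identity: phi_alpha(x) = (alpha/2) * |A_alpha x|^2 + phi(J_alpha(x)).
lhs = envelope(x, alpha, grid)
rhs = 0.5 * alpha * yosida(x, alpha) ** 2 + phi(resolvent(x, alpha))
print(lhs, rhs)  # both approximately 0.575

# Frechet derivative: d(phi_alpha) = A_alpha (central-difference check).
h = 1e-4
num_grad = (envelope(x + h, alpha, grid) - envelope(x - h, alpha, grid)) / (2 * h)
print(num_grad, yosida(x, alpha))  # both approximately 1.0

# Pointwise upward convergence: phi_alpha(x) increases to phi(x) as alpha -> 0.
for a in (1.0, 0.5, 0.1, 0.01):
    print(a, envelope(x, a, grid))  # 0.245, 0.45, 0.65, 0.695 -> |x| = 0.7
</syntaxhighlight>

Soft-thresholding is the standard closed form of the proximal map of the absolute value, which is what makes this one-dimensional check convenient.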
==References==
* Showalter, R. E. (1997). ''Monotone operators in Banach space and nonlinear partial differential equations''. Mathematical Surveys and Monographs 49. Providence, RI: American Mathematical Society. ISBN 0-8218-0500-2. (Proposition IV.1.8)
{{Functional analysis}}
[[Category:Convex analysis]]
[[Category:Theorems in functional analysis]]