In mathematical optimization, the Rosenbrock function is a non-convex function, introduced by Howard H. Rosenbrock in 1960, which is used as a performance test problem for optimization algorithms. It is also known as Rosenbrock's valley or Rosenbrock's banana function.
The global minimum is inside a long, narrow, parabolic-shaped flat valley. Finding the valley is trivial; converging to the global minimum, however, is difficult.
The function is defined by

    f(x, y) = (a - x)^2 + b (y - x^2)^2.

It has a global minimum at (x, y) = (a, a^2), where f(x, y) = 0. Usually, these parameters are set such that a = 1 and b = 100. Only in the trivial case where a = 0 is the function symmetric and the minimum at the origin.
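As a quick illustration, here is a minimal Python sketch of the two-dimensional function; the function name and parameter defaults are choices made for this example, not part of the original text:

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Two-dimensional Rosenbrock function f(x, y) = (a - x)^2 + b (y - x^2)^2."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# With the usual parameters a = 1, b = 100 the global minimum is at (1, 1).
print(rosenbrock(1.0, 1.0))  # 0.0
print(rosenbrock(0.0, 0.0))  # 1.0: on the valley floor y = x^2, but far from the minimum
```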
Multidimensional generalisations
Two variants are commonly encountered.
One is the sum of N/2 uncoupled 2D Rosenbrock problems, and is defined only for even N:

    f(\mathbf{x}) = \sum_{i=1}^{N/2} \left[ 100 (x_{2i-1}^2 - x_{2i})^2 + (x_{2i-1} - 1)^2 \right].

This variant has predictably simple solutions.
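A direct NumPy translation of this uncoupled variant might look as follows (the function name and the even-N assertion are this sketch's choices):

```python
import numpy as np

def rosen_uncoupled(x):
    """Sum of N/2 independent 2D Rosenbrock problems; defined only for even N."""
    x = np.asarray(x, dtype=float)
    assert x.size % 2 == 0, "this variant is defined only for even N"
    odd, even = x[0::2], x[1::2]  # x_{2i-1} and x_{2i} in the 1-based notation above
    return np.sum(100.0 * (odd ** 2 - even) ** 2 + (odd - 1.0) ** 2)

print(rosen_uncoupled(np.ones(6)))  # 0.0: every 2D subproblem attains its minimum at (1, 1)
```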
A second, more involved variant is

    f(\mathbf{x}) = \sum_{i=1}^{N-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right].

It has exactly one minimum for N = 3 (at (1, 1, 1)) and exactly two minima for 4 ≤ N ≤ 7: the global minimum at (1, 1, ..., 1) and a local minimum near \hat{\mathbf{x}} = (-1, 1, ..., 1). This result is obtained by setting the gradient of the function equal to zero and noticing that the resulting equation is a rational function of x. For small N the polynomials can be determined exactly, and Sturm's theorem can be used to determine the number of real roots, while the roots can be bounded in the region of |x_i| < 2.4. For larger N this method breaks down due to the size of the coefficients involved.
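The coupled variant is equally short to write down; here is a sketch in NumPy (the names are again illustrative):

```python
import numpy as np

def rosen_coupled(x):
    """Coupled variant: sum over i of 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2."""
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

print(rosen_coupled(np.ones(5)))                  # 0.0 at the global minimum
print(rosen_coupled([-1.0, 1.0, 1.0, 1.0, 1.0]))  # 4.0, close to the nearby local minimum
```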
Stationary points
Many of the stationary points of the function exhibit a regular pattern when plotted.
This structure can be exploited to locate them.
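One numerical way to locate stationary points, sketched here assuming NumPy and SciPy are available, is to solve the gradient equation ∇f(x) = 0 from many starting points and keep the distinct roots; the gradient below follows from differentiating the coupled variant above, and the sampling box uses the |x_i| < 2.4 bound mentioned earlier:

```python
import numpy as np
from scipy.optimize import fsolve

def rosen_grad(x):
    """Gradient of the coupled N-dimensional Rosenbrock variant."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

N = 4
roots = []
rng = np.random.default_rng(0)
for _ in range(200):
    x0 = rng.uniform(-2.4, 2.4, size=N)
    sol, info, ok, msg = fsolve(rosen_grad, x0, full_output=True)
    if ok == 1 and not any(np.allclose(sol, r, atol=1e-6) for r in roots):
        roots.append(sol)
print(np.array(roots))  # minima and saddle points of the N = 4 problem
```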
Optimization examples
The Rosenbrock function can be efficiently optimized by adapting an appropriate coordinate system without using any gradient information and without building local approximation models (in contrast to many derivative-free optimizers). The following figure illustrates an example of 2-dimensional Rosenbrock function optimization by adaptive coordinate descent from starting point x_0 = (-3, -4). A solution with function value 10^{-10} can be found after 325 function evaluations.
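The adaptive algorithm itself is not reproduced here, but a plain, non-adaptive coordinate descent baseline, sketched below with SciPy's scalar minimizer, shows why adapting the coordinate system helps: axis-aligned line searches zigzag and make slow progress along the curved valley.

```python
import numpy as np
from scipy.optimize import minimize_scalar, rosen

def coordinate_descent(f, x0, sweeps=50):
    """Exact line search along each fixed coordinate axis in turn (no adaptation)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for i in range(x.size):
            # Minimize f along coordinate i with the other coordinates held fixed.
            obj = lambda t, i=i: f(np.concatenate([x[:i], [t], x[i + 1:]]))
            x[i] = minimize_scalar(obj).x
    return x

print(coordinate_descent(rosen, [-3.0, -4.0]))  # creeps toward (1, 1) only slowly
```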
Using the Nelder–Mead method from starting point x_0 = (-1, 1) with a regular initial simplex, a minimum is found with function value 1.36 · 10^{-10} after 185 function evaluations. The figure below visualizes the evolution of the algorithm.
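For comparison, SciPy's built-in Nelder–Mead implementation can be run from the same starting point; the exact evaluation count depends on SciPy's initial simplex construction and tolerances, so it should not be expected to match the 185 evaluations quoted above.

```python
import numpy as np
from scipy.optimize import minimize, rosen

res = minimize(rosen, x0=np.array([-1.0, 1.0]), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
print(res.x, res.fun, res.nfev)  # solution, function value, number of evaluations
```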
See also
* Test functions for optimization
External links
* Rosenbrock function plot in 3D
* {{MathWorld | title=Rosenbrock Function | urlname=RosenbrockFunction}}