In mathematical optimization, the Rosenbrock function is a non-convex function, introduced by Howard H. Rosenbrock in 1960, which is used as a performance test problem for optimization algorithms. It is also known as Rosenbrock's valley or Rosenbrock's banana function.

The global minimum lies inside a long, narrow, parabolic-shaped flat valley. Finding the valley is trivial; converging to the global minimum, however, is difficult.

The function is defined by

f(x, y) = (a - x)^2 + b(y - x^2)^2

It has a global minimum at (x, y) = (a, a^2), where f(x, y) = 0. Usually the parameters are set to a = 1 and b = 100. Only in the trivial case where a = 0 is the function symmetric and the minimum at the origin.
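The definition above translates directly into code. The following is a minimal sketch in Python (the function and parameter names are illustrative, not from any particular library):

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Rosenbrock function f(x, y) = (a - x)^2 + b(y - x^2)^2."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

# The global minimum sits at (a, a^2), where the function value is 0.
print(rosenbrock(1.0, 1.0))  # 0.0
# Away from the minimum the first term dominates only weakly; with the
# usual a = 1, b = 100 the origin already gives f = 1.
print(rosenbrock(0.0, 0.0))  # 1.0
```

With a = 0 the minimum moves to the origin, matching the symmetric trivial case described above.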


Multidimensional generalisations

Two variants are commonly encountered. One is the sum of N/2 uncoupled 2D Rosenbrock problems, defined only for even N:

f(\mathbf{x}) = f(x_1, x_2, \dots, x_N) = \sum_{i=1}^{N/2} \left[ 100(x_{2i-1}^2 - x_{2i})^2 + (x_{2i-1} - 1)^2 \right]

This variant has predictably simple solutions.

A second, more involved variant is

f(\mathbf{x}) = \sum_{i=1}^{N-1} \left[ 100(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 \right], \quad \mathbf{x} = (x_1, \ldots, x_N) \in \mathbb{R}^N.

It has exactly one minimum for N = 3 (at (1, 1, 1)) and exactly two minima for 4 \le N \le 7: the global minimum at (1, 1, \dots, 1) and a local minimum near \hat{\mathbf{x}} = (-1, 1, \dots, 1). This result is obtained by setting the gradient of the function equal to zero and noticing that the resulting equation is a rational function of x. For small N the polynomials can be determined exactly, and Sturm's theorem can be used to determine the number of real roots, while the roots can be bounded in the region |x_i| < 2.4. For larger N this method breaks down due to the size of the coefficients involved.
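Both multidimensional variants can be sketched in a few lines of Python (the function names are illustrative):

```python
def rosenbrock_uncoupled(xs):
    """Sum of N/2 uncoupled 2D Rosenbrock problems; N must be even.

    Pairs (x_{2i-1}, x_{2i}) in the 1-based formula correspond to
    indices (2i, 2i+1) in 0-based Python indexing.
    """
    assert len(xs) % 2 == 0, "defined only for even N"
    return sum(100.0 * (xs[2 * i] ** 2 - xs[2 * i + 1]) ** 2
               + (xs[2 * i] - 1.0) ** 2
               for i in range(len(xs) // 2))

def rosenbrock_coupled(xs):
    """Chained variant: each consecutive pair (x_i, x_{i+1}) is coupled."""
    return sum(100.0 * (xs[i + 1] - xs[i] ** 2) ** 2 + (1.0 - xs[i]) ** 2
               for i in range(len(xs) - 1))

# Both variants vanish at the all-ones point, their global minimum.
print(rosenbrock_uncoupled([1.0] * 6))  # 0.0
print(rosenbrock_coupled([1.0] * 6))    # 0.0
```

Evaluating the coupled variant at the point (-1, 1, ..., 1) gives the value 4 (only the first summand is nonzero there); the local minimum mentioned above lies near, not exactly at, that point.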


Stationary points

Many of the stationary points of the function exhibit a regular pattern when plotted. This structure can be exploited to locate them.
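Stationary points are the zeros of the gradient. For the 2D function, the gradient has a simple closed form; the following sketch (with illustrative names, assuming the usual a = 1, b = 100 defaults) verifies that it vanishes at the global minimum:

```python
def rosenbrock_grad(x, y, a=1.0, b=100.0):
    """Analytic gradient of f(x, y) = (a - x)^2 + b(y - x^2)^2."""
    dfdx = -2.0 * (a - x) - 4.0 * b * x * (y - x ** 2)
    dfdy = 2.0 * b * (y - x ** 2)
    return dfdx, dfdy

# The gradient vanishes at the global minimum (a, a^2) = (1, 1),
# and is steep across the valley but shallow along it elsewhere.
gx, gy = rosenbrock_grad(1.0, 1.0)
print(gx == 0.0 and gy == 0.0)  # True
```

Setting dfdy = 0 forces y = x^2 (the parabolic valley floor), after which dfdx = 0 reduces to a polynomial in x alone, which is the reduction exploited in the Sturm's-theorem analysis above.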


Optimization examples

The Rosenbrock function can be efficiently optimized by adapting an appropriate coordinate system, without using any gradient information and without building local approximation models (in contrast to many derivative-free optimizers). The following figure illustrates an example of 2-dimensional Rosenbrock function optimization by adaptive coordinate descent from the starting point x_0 = (-3, -4); a solution with function value 10^{-10} is found after 325 function evaluations. Using the Nelder–Mead method from the starting point x_0 = (-1, 1) with a regular initial simplex, a minimum with function value 1.36 \cdot 10^{-10} is found after 185 function evaluations. The figure below visualizes the evolution of the algorithm.
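For contrast with the specialized methods above, the following sketch runs plain gradient descent on the 2D function (a = 1, b = 100). It is not adaptive coordinate descent or Nelder–Mead; it illustrates why the valley is hard: the step size must stay tiny for stability, so many iterations are needed to creep along the valley floor.

```python
def rosenbrock(x, y):
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def rosenbrock_grad(x, y):
    return (-2.0 * (1.0 - x) - 400.0 * x * (y - x ** 2),
            200.0 * (y - x ** 2))

# Plain gradient descent. The largest curvature near the minimum is on
# the order of 10^3, so the step size must be well below ~2e-3 for
# stability, and progress along the nearly flat valley is very slow.
x, y = 0.0, 0.0
lr = 0.001
for _ in range(200_000):
    gx, gy = rosenbrock_grad(x, y)
    x, y = x - lr * gx, y - lr * gy

print(x, y, rosenbrock(x, y))  # ends close to (1, 1) with a tiny value
```

Compare the iteration count with the few hundred function evaluations quoted above for adaptive coordinate descent and Nelder–Mead; the contrast is the point of using this function as a benchmark.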


See also

* Test functions for optimization


References


External links


* Rosenbrock function plot in 3D
* {{MathWorld |title=Rosenbrock Function |urlname=RosenbrockFunction}}