Derivative-free optimization

Derivative-free optimization (sometimes referred to as blackbox optimization) is a discipline in mathematical optimization that does not use derivative information in the classical sense to find optimal solutions: sometimes information about the derivative of the objective function ''f'' is unavailable, unreliable or impractical to obtain. For example, ''f'' might be non-smooth, time-consuming to evaluate, or noisy, so that methods that rely on derivatives or approximate them via finite differences are of little use. The problem of finding optimal points in such situations is referred to as derivative-free optimization, and algorithms that do not use derivatives or finite differences are called derivative-free algorithms.


Introduction

The problem to be solved is to numerically optimize an objective function f\colon A\to\mathbb{R} for some set A (usually A\subset\mathbb{R}^n), i.e. find x_0\in A such that, without loss of generality, f(x_0)\leq f(x) for all x\in A. When applicable, a common approach is to iteratively improve a parameter guess by local hill-climbing in the objective function landscape. Derivative-based algorithms use derivative information of f to find a good search direction, since, for example, the gradient gives the direction of steepest ascent. Derivative-based optimization is efficient at finding local optima for continuous-domain smooth single-modal problems. However, such methods can run into trouble when, e.g., A is disconnected or (mixed-)integer, or when f is expensive to evaluate, non-smooth, or noisy, so that (numeric approximations of) derivatives do not provide useful information. A slightly different problem arises when f is multi-modal, in which case local derivative-based methods only find local optima and may miss the global one. In derivative-free optimization, various methods are employed to address these challenges using only function values of f, but no derivatives. Some of these methods can be proved to discover optima, but others are rather metaheuristic, since the problems are in general more difficult to solve than in convex optimization. For these, the ambition is rather to efficiently find "good" parameter values, which can be near-optimal given enough resources, but optimality guarantees can typically not be given. One should keep in mind that the challenges are diverse, so that one can usually not use a single algorithm for all kinds of problems.
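
The iterative-improvement idea can be made concrete with a simple compass (coordinate) search: probe the objective along each coordinate direction and shrink the step when no probe improves the current point. The following is a minimal sketch in Python; the quadratic test function, starting point, step sizes and tolerances are illustrative assumptions, not part of the article.

import numpy as np

def compass_search(f, x0, step=1.0, shrink=0.5, tol=1e-8, max_evals=10000):
    # Iteratively improve x using only function values: try +/- step along
    # each coordinate axis; if no trial point improves on f(x), shrink the step.
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(x.size):
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                f_trial = f(trial)
                evals += 1
                if f_trial < fx:      # accept any improving move (greedy)
                    x, fx = trial, f_trial
                    improved = True
        if not improved:
            step *= shrink            # no improvement: refine the step size
    return x, fx

# Illustrative usage on a simple quadratic with minimum at (1, -2):
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
x_best, f_best = compass_search(f, [5.0, 5.0])
print(x_best, f_best)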


Algorithms

Notable derivative-free optimization algorithms include:
* Bayesian optimization
* Coordinate descent and adaptive coordinate descent
* Differential evolution, including multi-objective variants
* DONE
* Evolution strategies, natural evolution strategies (CMA-ES, xNES, SNES)
* Genetic algorithms
* MCS algorithm
* Nelder–Mead method (see the sketch after this list)
* Particle swarm optimization
* Pattern search
* Powell's methods based on interpolation, e.g., COBYLA (PRIMA)
* Random search (including Luus–Jaakola)
* Simulated annealing
* Stochastic optimization
* Subgradient method
* Various model-based algorithms like BOBYQA and ORBIT
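
For illustration, the Nelder–Mead simplex method listed above is available in common numerical libraries. The following is a minimal usage sketch, assuming SciPy is installed; the Rosenbrock test function, starting point and tolerances are illustrative choices, not taken from this article.

import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic smooth but ill-conditioned test function, minimum f = 0 at (1, 1).
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

# Nelder-Mead uses only function evaluations of f, never gradients.
result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method='Nelder-Mead',
                  options={'xatol': 1e-8, 'fatol': 1e-8, 'maxiter': 5000})
print(result.x, result.fun)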


Benchmarks

There exist benchmarks for blackbox optimization algorithms; see, e.g., the bbob-biobj test suite (Using Well-Understood Single-Objective Functions in Multiobjective Black-Box Optimization Test Suites, https://arxiv.org/abs/1604.00359, 2016).


See also

* Mathematical optimization


References


External links

* Audet, Charles; Kokkolaras, Michael (2016). "Blackbox and derivative-free optimization: theory, algorithms and applications". Optimization and Engineering. 17: 1–2. doi:10.1007/s11081-016-9307-4.