Deterministic global optimization

Deterministic global optimization is a branch of numerical optimization which focuses on finding the global solutions of an optimization problem whilst providing theoretical guarantees that the reported solution is indeed the global one, within some predefined tolerance. The term "deterministic global optimization" typically refers to complete or rigorous (see below) optimization methods. Rigorous methods converge to the global optimum in finite time. Deterministic global optimization methods are typically used when locating the global solution is a necessity (i.e. when the only naturally occurring state described by a mathematical model is the global minimum of an optimization problem), when it is extremely difficult to find a feasible solution, or simply when the user desires to locate the best possible solution to a problem.


Overview

Neumaier classified global optimization methods into four categories, according to the degree of rigour with which they approach the optimum:

* An incomplete method uses clever intuitive heuristics for searching, but has no safeguards if the search gets stuck in a local minimum.
* An asymptotically complete method reaches a global minimum with certainty, or at least with probability one, if allowed to run indefinitely long, but has no means of knowing when a global minimizer has been found.
* A complete method reaches a global minimum with certainty, assuming exact computations and an indefinitely long run time, and knows after a finite time that an approximate global minimizer has been found (to within prescribed tolerances).
* A rigorous method reaches a global minimum with certainty and within given tolerances even in the presence of rounding errors, except in near-degenerate cases, where the tolerances may be exceeded.

Deterministic global optimization methods typically belong to the last two categories. Note that building a rigorous piece of software is extremely difficult, as the process requires that all the dependencies are also coded rigorously.

Deterministic global optimization methods require ways to rigorously bound function values over regions of space. One could say that a main difference between deterministic and non-deterministic methods in this context is that the former perform calculations over regions of the solution space, whereas the latter perform calculations on single points. This is done either by exploiting particular functional forms (e.g. McCormick relaxations) or by using interval analysis in order to work with more general functional forms. In any case, bounding is required, which is why deterministic global optimization methods cannot give a rigorous result when working with
black-box code, unless that code is explicitly written to also return function bounds. For this reason, it is common for problems in deterministic global optimization to be represented using a computational graph, as it is straightforward to overload all operators such that the resulting function values or derivatives yield interval (rather than scalar) results.
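
The following Python snippet is a minimal sketch of this operator-overloading idea (the Interval class and the example function f are made up for illustration; a real implementation would additionally use outward rounding and cover subtraction, division, powers and transcendental functions):

    # Minimal natural interval arithmetic: evaluating an expression with
    # Interval operands yields bounds that enclose its range over the box.
    class Interval:
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            other = other if isinstance(other, Interval) else Interval(other, other)
            return Interval(self.lo + other.lo, self.hi + other.hi)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Interval) else Interval(other, other)
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

        __rmul__ = __mul__

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # The same code that evaluates f on scalars now yields interval bounds.
    def f(x, y):
        return x * x + 3 * x * y + 2

    print(f(Interval(-1.0, 2.0), Interval(0.0, 1.0)))  # encloses the range of f over the box

The printed interval is generally wider than the exact range (the so-called dependency problem), but it is a guaranteed enclosure in exact arithmetic, which is all that bounding-based methods require.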


Classes of deterministic global optimization problems


Linear programming problems (LP)

Linear programming problems are a highly desirable formulation for any practical problem, because, with the rise of interior-point algorithms, very large problems (involving hundreds of thousands or even millions of variables) can be solved efficiently to global optimality. Linear programming problems therefore fall strictly under the category of deterministic global optimization.
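
As a tiny concrete example (the data are made up; SciPy's linprog is used here simply as a readily available LP code, and any simplex or interior-point solver would serve), a two-variable LP is solved to proven global optimality as follows:

    from scipy.optimize import linprog

    # maximize  x1 + 2*x2   subject to  x1 + x2 <= 4,  x1 + 3*x2 <= 6,  x >= 0
    # (linprog minimizes, so the objective is negated)
    c = [-1.0, -2.0]
    A_ub = [[1.0, 1.0],
            [1.0, 3.0]]
    b_ub = [4.0, 6.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)  # optimal vertex (3, 1) with objective value -5

Because the feasible region is a polyhedron and the objective is linear, the optimum returned by such a solver is the global one (up to numerical tolerances), with no need for spatial branching.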


Mixed-integer linear programming problems (MILP)

Much like linear programming problems, MILPs are very important when solving decision-making models. Efficient algorithms for solving complex problems of this type are known and are available in the form of solvers such as CPLEX.
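
A minimal sketch, assuming SciPy's milp interface to the HiGHS branch-and-cut code is available (CPLEX and other solvers expose analogous modelling APIs); the coefficients are arbitrary and serve only to show how integrality restrictions are declared:

    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    # minimize  -x1 - 2*x2   subject to  3*x1 + 4*x2 <= 12,  x1, x2 integer and >= 0
    c = np.array([-1.0, -2.0])
    constraints = LinearConstraint(np.array([[3.0, 4.0]]), -np.inf, 12.0)
    integrality = np.array([1, 1])   # 1 marks a variable as integer
    bounds = Bounds(lb=0, ub=np.inf)

    res = milp(c, constraints=constraints, integrality=integrality, bounds=bounds)
    print(res.x, res.fun)            # provably optimal integer point (0, 3), objective -6

Branch-and-cut solvers of this kind terminate with a certificate that no better integer solution exists, which is what places MILP within deterministic global optimization.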


Non-linear programming problems (NLP)

Non-linear programming problems are extremely challenging in deterministic global optimization. The order of magnitude that a modern solver can be expected to handle in reasonable time is roughly 100 to a few hundred non-linear variables. At the time of writing, there are no parallel solvers for the deterministic solution of NLPs, which accounts for the complexity gap between deterministic LP and NLP solvers.


Mixed-integer non-linear programming problems (MINLP)

MINLP problems are even more challenging than their NLP counterparts, and solving them deterministically can be very difficult. Techniques such as integer cuts, or branching on the integer variables of a problem (thereby creating NLP sub-problems which can in turn be solved deterministically), are commonly used; the branching idea is sketched below.
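
The following Python sketch illustrates only the branching logic on a made-up two-variable problem: each node tightens the bounds of one integer variable and solves a continuous relaxation. In a genuine deterministic MINLP method the relaxation would itself be solved to guaranteed global optimality (e.g. by spatial branch and bound); the local NLP solver used here is merely a stand-in.

    import math
    from scipy.optimize import minimize

    def solve_relaxation(bounds):
        # Continuous relaxation: minimize (x-2.3)^2 + (y-1.7)^2 over the current box.
        x0 = [0.5 * (lo + hi) for lo, hi in bounds]
        res = minimize(lambda v: (v[0] - 2.3) ** 2 + (v[1] - 1.7) ** 2, x0, bounds=bounds)
        return res.x, res.fun

    def branch_and_bound(bounds, best=(math.inf, None)):
        x, lb = solve_relaxation(bounds)
        if lb >= best[0]:                       # node cannot improve the incumbent
            return best
        frac = [i for i, xi in enumerate(x) if abs(xi - round(xi)) > 1e-6]
        if not frac:                            # all variables integral: new incumbent
            return (lb, tuple(int(round(xi)) for xi in x))
        i = frac[0]                             # branch on one fractional variable
        for lo, hi in ((bounds[i][0], math.floor(x[i])), (math.ceil(x[i]), bounds[i][1])):
            if lo <= hi:
                child = list(bounds)
                child[i] = (lo, hi)
                best = branch_and_bound(child, best)
        return best

    print(branch_and_bound([(0, 5), (0, 5)]))   # approx. (0.18, (2, 2)) for this toy problem

For this toy problem the relaxation is convex, so the local solve does return its global minimum; for nonconvex MINLPs this is exactly the point where a deterministic lower-bounding procedure is needed.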


Zero-order methods

Zero-order methods make use of zero-order interval arithmetic only, i.e. interval bounds on the function values themselves. A representative example is interval bisection, sketched below.
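
A minimal sketch of interval bisection for a one-variable problem follows. The objective and its bounding routine bound_f (a hand-coded natural interval extension) are made up for illustration, and the outward rounding that a rigorous implementation would require is omitted:

    import heapq

    def bound_f(lo, hi):
        # Natural interval extension of f(x) = x**4 - 4*x**2 + x over [lo, hi].
        def even_power(a, b, p):
            vals = (a ** p, b ** p)
            return (0.0 if a <= 0.0 <= b else min(vals)), max(vals)
        x4_lo, x4_hi = even_power(lo, hi, 4)
        x2_lo, x2_hi = even_power(lo, hi, 2)
        return x4_lo - 4 * x2_hi + lo, x4_hi - 4 * x2_lo + hi

    def minimize_by_bisection(lo, hi, tol=1e-6):
        f = lambda x: x ** 4 - 4 * x ** 2 + x
        best_ub = min(f(lo), f(hi), f(0.5 * (lo + hi)))   # incumbent from sample points
        queue = [(bound_f(lo, hi)[0], lo, hi)]            # (lower bound, interval)
        while queue:
            lb, a, b = heapq.heappop(queue)
            if lb > best_ub - tol:                        # cannot hold a better minimum
                continue
            m = 0.5 * (a + b)
            best_ub = min(best_ub, f(m))                  # improve the incumbent
            for c, d in ((a, m), (m, b)):                 # bisect and keep useful halves
                sub_lb = bound_f(c, d)[0]
                if sub_lb <= best_ub - tol:
                    heapq.heappush(queue, (sub_lb, c, d))
        return best_ub                                    # within tol of the global minimum

    print(minimize_by_bisection(-3.0, 3.0))

Every discarded sub-interval provably cannot contain a point better than the incumbent minus the tolerance, which is what turns this simple bisection loop into a method with a guarantee rather than a heuristic.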


First-order methods

First-order methods make use of first-order information, such as interval gradients or interval slopes.
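
As a sketch of how such information is used (notation assumed here, not taken from a specific source): if F'(X) is an interval enclosure of the gradient of f over a box X with midpoint c, the mean value form

    f(x) \in f(c) + F'(X)^{\top} (x - c) \quad \text{for all } x \in X

encloses the range of f over X and is typically much tighter than the zero-order natural extension once the box is small. Interval gradients also permit a monotonicity test: if 0 \notin F'_i(X) for some component i, then f is strictly monotone in x_i on X, so the interior of X cannot contain an unconstrained minimizer and the box can be reduced or discarded.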


Second-order methods

Second-order methods make use of second-order information, usually eigenvalue bounds derived from interval Hessian matrices. One of the most general second-order methodologies for handling problems of general type is the αΒΒ algorithm.
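
In formulas (a sketch of the standard αΒΒ construction): on a box [x^L, x^U], a convex underestimator of a twice continuously differentiable function f is

    L(x) = f(x) + \sum_i \alpha_i (x_i^L - x_i)(x_i^U - x_i), \qquad \alpha_i \ge \max\{0, -\tfrac{1}{2}\lambda_{\min}\},

where \lambda_{\min} is a lower bound on the smallest eigenvalue of the Hessian of f over the box. Such a bound can be obtained from an interval Hessian [\underline{H}, \overline{H}] row by row via Gershgorin's circle theorem,

    \lambda_{\min} \ge \min_i \Big( \underline{h}_{ii} - \sum_{j \ne i} \max(|\underline{h}_{ij}|, |\overline{h}_{ij}|) \Big),

and minimizing the convex function L(x) then yields a valid lower bound for f on the box.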


Deterministic global optimization solvers

* ANTIGONE (Algorithms for coNTinuous / Integer Global Optimization of Nonlinear Equations) is proprietary software, available through the GAMS modelling platform.
* BARON is proprietary software, available under the AIMMS, AMPL, and GAMS modelling languages and on the NEOS Server.
* Couenne (Convex Over and Under ENvelopes for Nonlinear Estimation) is an open-source library.
* EAGO (Easy-Advanced Global Optimization) is an open-source solver written in the Julia programming language, developed at the University of Connecticut.
* LINDO (Linear, Interactive, and Discrete Optimizer) includes global optimization capabilities.
* MAiNGO (McCormick-based Algorithm for mixed-integer Nonlinear Global Optimization) is a C++ package with MPI and OpenMP parallelization, provided open-source under the Eclipse Public License v2.0.
* Octeract Engine is a proprietary solver with parallelization capabilities, developed and licensed by Octeract.
* SCIP is an open-source suite of optimization solvers which, among others, solves mixed-integer nonlinear programming (MINLP) problems.

