In numerical optimization, meta-optimization is the use of one optimization method to tune another optimization method. Meta-optimization is reported to have been used as early as the late 1970s by Mercer and Sampson for finding optimal parameter settings of a genetic algorithm.
Meta-optimization and related concepts are also known in the literature as meta-evolution, super-optimization, automated parameter calibration, hyper-heuristics, etc.
Motivation
Optimization methods such as genetic algorithms and differential evolution have several parameters that govern their behaviour and efficiency in optimizing a given problem, and these parameters must be chosen by the practitioner to achieve satisfactory results. Selecting the behavioural parameters by hand is a laborious task that is susceptible to human misconceptions of what makes the optimizer perform well.
The behavioural parameters of an optimizer can be varied and the optimization performance plotted as a landscape. This is computationally feasible for optimizers with few behavioural parameters and optimization problems that are fast to compute, but when the number of behavioural parameters increases, the time needed to compute such a performance landscape grows exponentially. This is the curse of dimensionality for the search space consisting of an optimizer's behavioural parameters. An efficient method is therefore needed to search the space of behavioural parameters.
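As an illustration, the following sketch computes such a performance landscape by brute force for an inner optimizer with only two behavioural parameters (step size and iteration count). The hill climber, the test problem, and all parameter ranges are hypothetical choices for illustration only; with k grid points per parameter and d behavioural parameters, the number of evaluations grows as k^d.

```python
import numpy as np

def sphere(x):
    """Test problem for the inner optimizer: minimize the sphere function."""
    return np.sum(x ** 2)

def hill_climber(f, dim, step_size, num_iters, rng):
    """A minimal inner optimizer with two behavioural parameters:
    step_size and num_iters (hypothetical, for illustration)."""
    x = rng.uniform(-5.0, 5.0, dim)
    fx = f(x)
    for _ in range(num_iters):
        y = x + rng.normal(0.0, step_size, dim)
        fy = f(y)
        if fy < fx:  # accept the move only if it improves
            x, fx = y, fy
    return fx

rng = np.random.default_rng(0)
step_sizes = np.linspace(0.01, 2.0, 20)  # grid over parameter 1
iter_counts = np.arange(10, 210, 10)     # grid over parameter 2

# Performance landscape: mean best fitness over repeated runs.
landscape = np.empty((len(step_sizes), len(iter_counts)))
for i, s in enumerate(step_sizes):
    for j, n in enumerate(iter_counts):
        runs = [hill_climber(sphere, 5, s, int(n), rng) for _ in range(10)]
        landscape[i, j] = np.mean(runs)

best = np.unravel_index(np.argmin(landscape), landscape.shape)
print("best step_size:", step_sizes[best[0]],
      "best num_iters:", iter_counts[best[1]])
```

Even this two-parameter grid already requires 20 × 20 × 10 = 4,000 runs of the inner optimizer, which illustrates why exhaustive evaluation quickly becomes infeasible as parameters are added.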
Methods
A simple way of finding good behavioural parameters for an optimizer is to employ another overlaying optimizer, called the meta-optimizer. There are different ways of doing this, depending on whether the behavioural parameters to be tuned are real-valued or discrete-valued, on what performance measure is being used, etc.
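A minimal sketch of this arrangement follows, assuming SciPy's differential_evolution as the inner optimizer and plain random search as the meta-optimizer; the benchmark problem, parameter ranges, budgets, and the performance measure (mean best fitness over repeated runs) are illustrative assumptions, not settings prescribed by the literature.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    """Multimodal benchmark problem for the inner optimizer."""
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def inner_performance(params, repetitions=5):
    """Performance measure (illustrative): mean best fitness of the
    inner optimizer run with the given behavioural parameters."""
    mutation, recombination = params
    results = []
    for seed in range(repetitions):
        res = differential_evolution(
            rastrigin,
            bounds=[(-5.12, 5.12)] * 5,
            mutation=mutation,
            recombination=recombination,
            maxiter=50,
            seed=seed,
            tol=0,         # run the full budget for a fair comparison
            polish=False,  # disable local refinement to isolate DE itself
        )
        results.append(res.fun)
    return np.mean(results)

# Meta-optimizer: plain random search over the behavioural parameters.
rng = np.random.default_rng(1)
best_params, best_score = None, np.inf
for _ in range(30):
    candidate = (rng.uniform(0.1, 1.9),   # mutation factor F
                 rng.uniform(0.0, 1.0))   # crossover probability CR
    score = inner_performance(candidate)
    if score < best_score:
        best_params, best_score = candidate, score

print("tuned (F, CR):", best_params, "mean best fitness:", best_score)
```

Note that each meta-evaluation requires several complete runs of the inner optimizer, so the meta-level search is far more expensive than a single ordinary optimization run; this is why cheap meta-optimizers such as random search or racing methods are often paired with small repetition budgets.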
Meta-optimization of the parameters of a genetic algorithm was done by Grefenstette and Keane, amongst others, and experiments with meta-optimizing both the parameters and the genetic operators were reported by Bäck. Meta-optimization of the COMPLEX-RF algorithm was done by Krus and Andersson, and by Krus and Ölvander, where a performance index of optimization based on information theory was introduced and further developed. Meta-optimization of particle swarm optimization was done by Meissner et al., Pedersen and Chipperfield, and Mason et al. Pedersen and Chipperfield also applied meta-optimization to differential evolution. Birattari et al. meta-optimized ant colony optimization. Statistical models have also been used to reveal more about the relationship between choices of behavioural parameters and optimization performance; see for example Francois and Lavergne, and Nannen and Eiben. A comparison of various meta-optimization techniques was done by Smit and Eiben.
See also
* Automated machine learning (AutoML)
* Hyper-heuristics
References
* Mercer, R.E.; Sampson, J.R. (1978). "Adaptive search using a reproductive metaplan". Kybernetes. 7 (3): 215–228. doi:10.1108/eb005486.
* Grefenstette, J.J. (1986). "Optimization of control parameters for genetic algorithms". IEEE Transactions on Systems, Man, and Cybernetics. 16 (1): 122–128. doi:10.1109/TSMC.1986.289288.
* Keane, A.J. (1995). "Genetic algorithm optimization in multi-peak problems: studies in convergence and robustness". Artificial Intelligence in Engineering. 9 (2): 75–83. doi:10.1016/0954-1810(95)95751-Q.
* Bäck, T. (1994). "Parallel optimization of evolutionary algorithms". Proceedings of the International Conference on Evolutionary Computation. pp. 418–427.
* Krus, P.K.; Andersson (Ölvander), J. (2003). "Optimizing optimization for design optimization". Proceedings of DETC’03, ASME Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Chicago, Illinois, USA.
* Krus, P.K.; Ölvander (Andersson), J. (2013). "Performance index and meta-optimization of a direct search optimization method". Engineering Optimization. 45 (10): 1167–1185. doi:10.1080/0305215X.2012.725052. http://liu.diva-portal.org/smash/get/diva2:572570/FULLTEXT01.pdf
* Meissner, M.; Schmuker, M.; Schneider, G. (2006). "Optimized Particle Swarm Optimization (OPSO) and its application to artificial neural network training". BMC Bioinformatics. 7 (1): 125. doi:10.1186/1471-2105-7-125. PMID 16529661. PMC 1464136.
* Pedersen, M.E.H.; Chipperfield, A.J. (2010). "Simplifying particle swarm optimization". Applied Soft Computing. 10 (2): 618–628. doi:10.1016/j.asoc.2009.08.029.
* Mason, Karl; Duggan, Jim; Howley, Enda (2018). "A Meta Optimisation Analysis of Particle Swarm Optimisation Velocity Update Equations for Watershed Management Learning". Applied Soft Computing. 62: 148–161. doi:10.1016/j.asoc.2017.10.018.
* Pedersen, M.E.H. (2010). Tuning & Simplifying Heuristical Optimization (PhD thesis). University of Southampton, School of Engineering Sciences, Computational Engineering and Design Group. https://web.archive.org/web/20200213130101/https://pdfs.semanticscholar.org/a5d2/8c26a2e2824170d67b69f14120cf67cabe26.pdf
* Birattari, M.; Stützle, T.; Paquete, L.; Varrentrapp, K. (2002). "A racing algorithm for configuring metaheuristics". Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). pp. 11–18. https://www.researchgate.net/publication/220740639
* Birattari, M. (2004). The Problem of Tuning Metaheuristics as Seen from a Machine Learning Perspective (PhD thesis). Université Libre de Bruxelles. http://iridia.ulb.ac.be/~mbiro/paperi/BirattariPhD.pdf
* Francois, O.; Lavergne, C. (2001). "Design of evolutionary algorithms – a statistical perspective". IEEE Transactions on Evolutionary Computation. 5 (2): 129–148. doi:10.1109/4235.918434.
* Nannen, V.; Eiben, A.E. (2006). "A method for parameter calibration and relevance estimation in evolutionary algorithms". Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO). pp. 183–190. http://srv.uib.es/pub/1012.pdf
* Smit, S.K.; Eiben, A.E. (2009). "Comparing parameter tuning methods for evolutionary algorithms". Proceedings of the IEEE Congress on Evolutionary Computation (CEC). pp. 399–406. http://www.few.vu.nl/~gusz/papers/2009-CEC-tuning-methods.pdf