Motivation
At each step, the LJ heuristic maintains a box from which it samples points randomly, using a uniform distribution on the box. For a unimodal function, the probability of reducing the objective function decreases as the box approaches a minimum.
Heuristic

Let ''f'': ℝ''n'' → ℝ be the fitness or cost function to be minimized. Let x ∈ ℝ''n'' designate a position or candidate solution in the search-space. The LJ heuristic iterates the following steps:

* Initialize x ~ ''U''(blo, bup) with a random uniform position in the search-space, where blo and bup are the lower and upper boundaries, respectively.
* Set the initial sampling range to cover the entire search-space (or a part of it): d = bup − blo
* Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
** Pick a random vector a ~ ''U''(−d, d)
** Add this to the current position x to create the new potential position y = x + a
** If ''f''(y) < ''f''(x), move to the new position by setting x = y; otherwise, decrease the sampling range: d = 0.95 d
* Now x holds the best-found position.
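The loop above translates almost directly into code. The following is a minimal sketch in Python with NumPy, following the steps listed above; the function name luus_jaakola and the fixed iteration budget max_iter are illustrative choices, not part of the original formulation.

<syntaxhighlight lang="python">
import numpy as np

def luus_jaakola(f, b_lo, b_up, max_iter=1000, rng=None):
    """Minimal LJ sketch: uniform sampling in a shrinking box."""
    rng = np.random.default_rng(rng)
    b_lo = np.asarray(b_lo, dtype=float)
    b_up = np.asarray(b_up, dtype=float)

    # Initialize x uniformly in the search-space; the sampling range d
    # initially covers the whole box.
    x = rng.uniform(b_lo, b_up)
    d = b_up - b_lo

    for _ in range(max_iter):      # termination criterion: fixed iteration budget
        a = rng.uniform(-d, d)     # random step a ~ U(-d, d)
        y = x + a                  # candidate position
        if f(y) < f(x):
            x = y                  # accept the improvement
        else:
            d = 0.95 * d           # otherwise shrink the sampling range
    return x                       # best-found position
</syntaxhighlight>

For example, luus_jaakola(lambda x: np.sum(x**2), [-5, -5], [5, 5]) searches the two-dimensional box [−5, 5]² for the minimum of the sphere function.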
Variations

Luus notes that ARS (Adaptive Random Search) algorithms proposed to date differ in regard to many aspects:

* Procedure of generating random trial points.
* Number of internal loops (NIL, the number of random search points in each cycle).
* Number of cycles (NEL, number of external loops).
* Contraction coefficient of the search region size (example values range from 0.95 to 0.60).
** Whether the region reduction rate is the same for all variables or a different rate for each variable (called the M-LJ algorithm).
** Whether the region reduction rate is a constant or follows another distribution (e.g. Gaussian).
* Whether to incorporate a line search.
* Whether to consider constraints of the random points as acceptance criteria, or to incorporate a quadratic penalty.
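To illustrate two of these variation points, the NIL/NEL loop structure and a per-variable contraction rate (as in the M-LJ algorithm), here is a hedged sketch; the loop structure, default values, and parameter names (nel, nil, rates) are assumptions for illustration, not Luus's exact formulation.

<syntaxhighlight lang="python">
import numpy as np

def ars_variant(f, b_lo, b_up, nel=50, nil=20, rates=None, rng=None):
    """Hypothetical ARS-style sketch: NEL cycles of NIL trial points,
    with a separate contraction rate per variable (M-LJ style)."""
    rng = np.random.default_rng(rng)
    b_lo = np.asarray(b_lo, dtype=float)
    b_up = np.asarray(b_up, dtype=float)

    d = b_up - b_lo                              # initial search-region size
    rates = np.full_like(d, 0.95) if rates is None else np.asarray(rates, dtype=float)

    best_x = rng.uniform(b_lo, b_up)             # initial candidate
    best_f = f(best_x)

    for _ in range(nel):                         # external loops (NEL)
        for _ in range(nil):                     # internal loops (NIL)
            y = best_x + rng.uniform(-d, d)      # trial point around the best so far
            fy = f(y)
            if fy < best_f:
                best_x, best_f = y, fy
        d = rates * d                            # contract each variable's range
    return best_x
</syntaxhighlight>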
Convergence

Nair proved a convergence analysis. For twice continuously differentiable functions, the LJ heuristic generates a sequence of iterates having a convergent subsequence. For this class of problems, Newton's method is the usual optimization method, and it has quadratic convergence (''regardless of the dimension'' of the space, which can be a Banach space).

When applied to twice continuously differentiable problems, however, the LJ heuristic's rate of convergence decreases as the number of dimensions increases: "The catastrophic growth [in the number of iterations needed to reach an approximate solution of a given accuracy] as [the number of dimensions increases to infinity] shows that it is meaningless to pose the question of constructing universal methods of solving ... problems of any appreciable dimensionality 'generally'. It is interesting to note that the same [conclusion] holds for ... problems generated by uni-extremal [that is, unimodal] (but not convex) functions."
See also
* Random optimization is a related family of optimization methods that sample from general distributions, for example the uniform distribution.
* Random search is a related family of optimization methods that sample from general distributions, for example, a uniform distribution on the unit sphere.
* Pattern search is a related family of methods that are used on noisy observations.