Truncated Newton methods, originated in a paper by Ron Dembo and Trond Steihaug and also known as Hessian-free optimization, are a family of optimization algorithms designed for optimizing non-linear functions with large numbers of independent variables; convergence results for the algorithm can be found in the literature. A truncated Newton method consists of repeated application of an iterative optimization algorithm to approximately solve Newton's equations, determining an update to the function's parameters. The inner solver is ''truncated'', i.e., run for only a limited number of iterations. It follows that, for truncated Newton methods to work, the inner solver must produce a good approximation in a finite number of iterations; the conjugate gradient method has been suggested and evaluated as a candidate inner loop. Another prerequisite is good preconditioning for the inner algorithm.
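
As an illustration of the idea (a minimal sketch, not code from the source), the following Python fragment shows the structure just described: an outer Newton-style loop, an inner conjugate-gradient loop that solves Newton's equations H p = -g but is truncated after a fixed number of iterations, and Hessian-vector products formed by finite differences of the gradient so that the Hessian is never built explicitly. The function names (truncated_newton, hess_vec) and all parameter defaults are illustrative assumptions.

    import numpy as np

    def truncated_newton(f, grad, x0, max_outer=100, max_inner=10, tol=1e-6):
        # Illustrative sketch; names and defaults are assumptions, not from the source.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_outer):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break

            def hess_vec(v, eps=1e-6):
                # Hessian-vector product approximated by a finite difference
                # of the gradient: H v ~ (grad(x + eps v) - grad(x)) / eps.
                return (grad(x + eps * v) - g) / eps

            # Truncated conjugate gradient on H p = -g, starting from p = 0.
            p = np.zeros_like(x)
            r = -g                      # residual of H p = -g at p = 0
            d = r.copy()
            rs_old = r @ r
            for _ in range(max_inner):  # the inner loop is "truncated" here
                Hd = hess_vec(d)
                curvature = d @ Hd
                if curvature <= 0:      # non-positive curvature: stop early
                    break
                alpha = rs_old / curvature
                p += alpha * d
                r -= alpha * Hd
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                d = r + (rs_new / rs_old) * d
                rs_old = rs_new

            if not p.any():             # inner loop made no progress: gradient step
                p = -g

            # Simple backtracking line search to keep the sketch robust.
            t, fx = 1.0, f(x)
            while f(x + t * p) >= fx and t > 1e-12:
                t *= 0.5
            x = x + t * p
        return x

    # Example: minimize the two-dimensional Rosenbrock function.
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(truncated_newton(rosen, rosen_grad, np.array([-1.2, 1.0])))

A preconditioner, when available, would be applied inside the inner conjugate-gradient loop to accelerate its convergence; it is omitted from this sketch for brevity.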

