Householder's Method

In mathematics, and more specifically in numerical analysis, Householder's methods are a class of root-finding algorithms that are used for functions of one real variable with continuous derivatives up to some order $d+1$. Each of these methods is characterized by the number $d$, which is known as the ''order'' of the method. The algorithm is iterative and has a rate of convergence of $d+1$. These methods are named after the American mathematician Alston Scott Householder. The case of $d=1$ corresponds to Newton's method; the case of $d=2$ corresponds to Halley's method.


Method

Householder's method is a numerical algorithm for solving the equation $f(x)=0$. In this case, the function $f$ has to be a function of one real variable. The method consists of a sequence of iterations

$$x_{n+1} = x_n + d\,\frac{(1/f)^{(d-1)}(x_n)}{(1/f)^{(d)}(x_n)}$$

beginning with an initial guess $x_0$.

If $f$ is a $d+1$ times continuously differentiable function and $a$ is a zero of $f$ but not of its derivative, then, in a neighborhood of $a$, the iterates $x_n$ satisfy

$$|x_{n+1} - a| \le K \cdot |x_n - a|^{d+1}, \quad \text{for some } K > 0.$$

This means that the iterates converge to the zero if the initial guess is sufficiently close, and that the convergence has order $d+1$ or better. Furthermore, when close enough to $a$, it commonly is the case that $x_{n+1} - a \approx C\,(x_n - a)^{d+1}$ for some $C \ne 0$. In particular,
* if $d$ is even and $C > 0$ then convergence to $a$ will be from values greater than $a$;
* if $d$ is even and $C < 0$ then convergence to $a$ will be from values less than $a$;
* if $d$ is odd and $C > 0$ then convergence to $a$ will be from the side where it starts; and
* if $d$ is odd and $C < 0$ then convergence to $a$ will alternate sides.

Despite their order of convergence, these methods are not widely used because the gain in precision is not commensurate with the rise in effort for large $d$. The ''Ostrowski index'' expresses the error reduction in the number of function evaluations instead of the iteration count.
* For polynomials, the evaluation of the first $d$ derivatives of $f$ at $x_n$ using Horner's method has an effort of $d+1$ polynomial evaluations. Since $n(d+1)$ evaluations over $n$ iterations give an error exponent of $(d+1)^n$, the exponent for one function evaluation is $\sqrt[d+1]{d+1}$, numerically about $1.4142,\ 1.4422,\ 1.4142,\ 1.3797$ for $d = 1, 2, 3, 4$, and falling after that. By this criterion, the case $d=2$ (Halley's method) is the optimal value of $d$.
* For general functions the derivative evaluation using the Taylor arithmetic of automatic differentiation requires the equivalent of $(d+1)(d+2)/2$ function evaluations. One function evaluation thus reduces the error by an exponent of $\sqrt[(d+1)(d+2)/2]{d+1}$, which is $\sqrt[3]{2} \approx 1.2599$ for Newton's method, $\sqrt[6]{3} \approx 1.2009$ for Halley's method, and falling towards 1, i.e. linear convergence, for the higher-order methods.
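As an illustration of the general iteration, the following Python sketch (not part of the original article; the helper name `householder`, the test polynomial and the starting guess are illustrative choices) forms the derivatives of $1/f$ symbolically with sympy and applies the update $x_{n+1} = x_n + d\,(1/f)^{(d-1)}(x_n)/(1/f)^{(d)}(x_n)$ for a chosen order $d$, assuming a simple root nearby.

```python
# A minimal sketch of the general Householder iteration
#   x_{n+1} = x_n + d * (1/f)^(d-1)(x_n) / (1/f)^(d)(x_n),
# using sympy to obtain the derivatives of 1/f symbolically.
import sympy as sp

def householder(f_expr, x_sym, x0, d, steps=5):
    """Iterate Householder's method of order d, starting from x0."""
    recip = 1 / f_expr
    num = sp.lambdify(x_sym, sp.diff(recip, x_sym, d - 1), "math")  # (1/f)^(d-1)
    den = sp.lambdify(x_sym, sp.diff(recip, x_sym, d), "math")      # (1/f)^(d)
    x = float(x0)
    for _ in range(steps):
        x = x + d * num(x) / den(x)
    return x

if __name__ == "__main__":
    x = sp.Symbol("x")
    f = x**3 + 6*x**2 + 10*x - 1      # the example polynomial used later in the article
    for d in (1, 2, 3):
        print(d, householder(f, x, 0.0, d))
```

For $d=1$ this reduces to Newton's method and for $d=2$ to Halley's method; with the polynomial above, all three orders converge quickly to the root near $0.09455$.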


Motivation


First approach

Suppose $f$ is analytic in a neighborhood of $a$ and $f(a)=0$. Then $f$ has a Taylor series at $a$ and its constant term is zero. Because this constant term is zero, the function $f(x)/(x-a)$ will have a Taylor series at $a$ and, when $f'(a) \ne 0$, its constant term will not be zero. Because that constant term is not zero, it follows that the reciprocal $(x-a)/f(x)$ has a Taylor series at $a$, which we will write as $\sum_{\nu=0}^{\infty} c_\nu (x-a)^\nu$, and its constant term $c_0$ will not be zero. Using that Taylor series we can write

$$\frac{1}{f(x)} = \frac{c_0}{x-a} + \sum_{\nu=1}^{\infty} c_\nu (x-a)^{\nu-1}\,.$$

When we compute its $d$-th derivative, we note that the terms for $\nu = 1, \ldots, d$ conveniently vanish:

$$\left(\frac{1}{f}\right)^{(d)}(x) = \frac{(-1)^d\, d!\, c_0}{(x-a)^{d+1}} + \sum_{\nu=d+1}^{\infty} c_\nu \frac{(\nu-1)!}{(\nu-1-d)!}\,(x-a)^{\nu-1-d} = \frac{(-1)^d\, d!\, c_0}{(x-a)^{d+1}} \left(1 + \frac{(-1)^d}{d!\, c_0}\sum_{\nu=d+1}^{\infty} c_\nu \frac{(\nu-1)!}{(\nu-1-d)!}\,(x-a)^{\nu}\right) = \frac{(-1)^d\, d!\, c_0}{(x-a)^{d+1}} \left(1 + \mathcal{O}\left((x-a)^{d+1}\right)\right)\,,$$

using big O notation. We thus get that the correction term that we add to $x$ to get a value of $x$ that is closer to $a$ is

$$d\,\frac{(1/f)^{(d-1)}(x)}{(1/f)^{(d)}(x)} = d\,\frac{-1}{d}\,(x-a)\,\frac{1+\mathcal{O}\left((x-a)^{d}\right)}{1+\mathcal{O}\left((x-a)^{d+1}\right)} = -(x-a) + \mathcal{O}\left((x-a)^{d+1}\right)\,.$$

Thus, $x + d\,\frac{(1/f)^{(d-1)}(x)}{(1/f)^{(d)}(x)}$ differs from $a$ only by an error of order $(x-a)^{d+1}$.
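This estimate can be checked numerically. The sketch below (illustrative, not from the article; the test function is an arbitrary choice with a simple root at $a=1$) evaluates the correction term at $x = a + \varepsilon$ and adds $\varepsilon$ back; the residual should shrink roughly like $\varepsilon^{d+1}$.

```python
# Illustrative check that d*(1/f)^(d-1)/(1/f)^(d) = -(x - a) + O((x - a)^(d+1))
# near a simple root a; here f is an arbitrary test function with a root at a = 1.
import sympy as sp

x = sp.Symbol("x")
f = (x - 1) * (x**2 + 3)
a = 1.0
for d in (1, 2, 3):
    recip = 1 / f
    corr = sp.cancel(d * sp.diff(recip, x, d - 1) / sp.diff(recip, x, d))
    for eps in (1e-1, 1e-2):
        residual = float(corr.subs(x, a + eps)) + eps   # expected size: about eps**(d+1)
        print(d, eps, residual)
```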


Second approach

Suppose $x = a$ is a simple root. Then near $x = a$, $(1/f)(x)$ is a meromorphic function. Suppose we have the Taylor expansion

$$(1/f)(x) = \sum_{d=0}^{\infty} \frac{(1/f)^{(d)}(b)}{d!}\,(x-b)^d$$

around a point $b$ that is closer to $a$ than it is to any other zero of $f$. By König's theorem, we have

$$a - b = \lim_{d\to\infty} \frac{(1/f)^{(d-1)}(b)/(d-1)!}{(1/f)^{(d)}(b)/d!} = \lim_{d\to\infty}\, d\,\frac{(1/f)^{(d-1)}(b)}{(1/f)^{(d)}(b)}\,.$$

These suggest that Householder's iteration might be a good convergent iteration. The actual proof of the convergence is also based on these ideas.
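The limit in König's theorem can be observed numerically. The sketch below (illustrative; the expansion point $b=0$ and the cubic are borrowed from the example later in the article) prints ratios of successive Taylor coefficients of $1/f$, which approach the zero of $f$ nearest to $b$.

```python
# Illustrative demonstration of the Koenig-theorem limit: ratios of successive
# Taylor coefficients of 1/f at b approach a - b, where a is the nearest zero of f.
import sympy as sp

x = sp.Symbol("x")
f = x**3 + 6*x**2 + 10*x - 1          # nearest zero to b = 0 is a = 0.0945514815...
b = 0
series = sp.series(1 / f, x, b, 12).removeO()
c = [series.coeff(x, k) for k in range(12)]   # Taylor coefficients c_0 .. c_11
for d in range(1, 12):
    print(d, float(c[d - 1] / c[d]))          # tends towards a - b
```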


The methods of lower order

Householder's method of order 1 is just Newton's method, since

$$\begin{align} x_{n+1} &= x_n + 1\cdot\frac{(1/f)(x_n)}{(1/f)^{(1)}(x_n)} \\ &= x_n + \frac{1}{f(x_n)}\cdot\left(\frac{-f'(x_n)}{f(x_n)^2}\right)^{-1} \\ &= x_n - \frac{f(x_n)}{f'(x_n)}\,. \end{align}$$

For Householder's method of order 2 one gets Halley's method, since the identities

$$(1/f)'(x) = -\frac{f'(x)}{f(x)^2} \quad\text{and}\quad (1/f)''(x) = -\frac{f''(x)}{f(x)^2} + 2\,\frac{f'(x)^2}{f(x)^3}$$

result in

$$\begin{align} x_{n+1} &= x_n + 2\,\frac{(1/f)'(x_n)}{(1/f)''(x_n)} \\ &= x_n + \frac{-2\,f(x_n)\,f'(x_n)}{-f(x_n)\,f''(x_n) + 2\,f'(x_n)^2} \\ &= x_n - \frac{2\,f(x_n)\,f'(x_n)}{2\,f'(x_n)^2 - f(x_n)\,f''(x_n)} \\ &= x_n + h_n\;\frac{1}{1 + \frac{1}{2}\,\frac{f''(x_n)}{f'(x_n)}\,h_n}\,. \end{align}$$

In the last line, $h_n = -\tfrac{f(x_n)}{f'(x_n)}$ is the update of the Newton iteration at the point $x_n$. This line was added to demonstrate where the difference to the simple Newton's method lies.

The third order method is obtained from the identity of the third order derivative of $1/f$,

$$(1/f)'''(x) = -\frac{f'''(x)}{f(x)^2} + 6\,\frac{f'(x)\,f''(x)}{f(x)^3} - 6\,\frac{f'(x)^3}{f(x)^4}\,,$$

and has the formula

$$\begin{align} x_{n+1} &= x_n + 3\,\frac{(1/f)''(x_n)}{(1/f)'''(x_n)} \\ &= x_n - \frac{6\,f(x_n)\,f'(x_n)^2 - 3\,f(x_n)^2\,f''(x_n)}{6\,f'(x_n)^3 - 6\,f(x_n)\,f'(x_n)\,f''(x_n) + f(x_n)^2\,f'''(x_n)} \\ &= x_n + h_n\;\frac{1 + \frac{1}{2}\,\frac{f''(x_n)}{f'(x_n)}\,h_n}{1 + \frac{f''(x_n)}{f'(x_n)}\,h_n + \frac{1}{6}\,\frac{f'''(x_n)}{f'(x_n)}\,h_n^2} \end{align}$$

and so on.
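The equivalence between the $(1/f)$-derivative form and these closed-form updates can be verified symbolically. The following sketch (illustrative, not from the article) asks sympy to simplify the difference between $d\,(1/f)^{(d-1)}/(1/f)^{(d)}$ and the Newton, Halley and third-order corrections above.

```python
# Symbolic check that d * (1/f)^(d-1) / (1/f)^(d) reproduces the closed-form
# Newton, Halley and third-order corrections given above.
import sympy as sp

x = sp.Symbol("x")
f = sp.Function("f")(x)
recip = 1 / f
closed_form = {
    1: -f / f.diff(x),
    2: -2 * f * f.diff(x) / (2 * f.diff(x)**2 - f * f.diff(x, 2)),
    3: -(6 * f * f.diff(x)**2 - 3 * f**2 * f.diff(x, 2))
       / (6 * f.diff(x)**3 - 6 * f * f.diff(x) * f.diff(x, 2) + f**2 * f.diff(x, 3)),
}
for d, expected in closed_form.items():
    update = d * sp.diff(recip, x, d - 1) / sp.diff(recip, x, d)
    print(d, sp.simplify(update - expected))   # prints 0 when the two forms agree
```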


Example

The first problem solved by Newton with the Newton-Raphson-Simpson method was the polynomial equation $y^3 - 2y - 5 = 0$. He observed that there should be a solution close to 2. Replacing $y = x + 2$ transforms the equation into

$$0 = f(x) = -1 + 10x + 6x^2 + x^3\,.$$

The Taylor series of the reciprocal function starts with

$$\begin{align} 1/f(x) = & -1 - 10\,x - 106\,x^2 - 1121\,x^3 - 11856\,x^4 - 125392\,x^5 \\ & - 1326177\,x^6 - 14025978\,x^7 - 148342234\,x^8 - 1568904385\,x^9 \\ & - 16593123232\,x^{10} + O(x^{11})\,. \end{align}$$

The result of applying Householder's methods of various orders at $x_0 = 0$ is also obtained by dividing neighboring coefficients of the latter power series: writing $c_k$ for the coefficient of $x^k$ above, the method of order $d$ gives $x_1 = c_{d-1}/c_d$ after one iteration step. For example, in the case of the 3rd order, $x_1 = 0.0 + 106/1121 = 0.09455842997324\ldots$ As one can see, there are a little bit more than $d$ correct decimal places for each order $d$. The correct solution begins $0.0945514815\ldots$

To compute the subsequent values $x_2, x_3, x_4$ for the lowest orders, use

$$f(x) = -1 + 10x + 6x^2 + x^3\,, \qquad f'(x) = 10 + 12x + 3x^2\,, \qquad f''(x) = 12 + 6x\,, \qquad f'''(x) = 6\,,$$

together with the following update relations (a small computational sketch follows below):
* 1st order: $x_{i+1} = x_i - f(x_i)/f'(x_i)$
* 2nd order: $x_{i+1} = x_i - \dfrac{2\,f(x_i)\,f'(x_i)}{2\,f'(x_i)^2 - f(x_i)\,f''(x_i)}$
* 3rd order: $x_{i+1} = x_i - \dfrac{6\,f(x_i)\,f'(x_i)^2 - 3\,f(x_i)^2\,f''(x_i)}{6\,f'(x_i)^3 - 6\,f(x_i)\,f'(x_i)\,f''(x_i) + f(x_i)^2\,f'''(x_i)}$
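The iterates can be reproduced with a few lines of code. The sketch below is illustrative, not from the article; it uses exact rational arithmetic via `fractions.Fraction` so that rounding does not obscure the comparison, applies the three update relations above starting from $x_0 = 0$, and prints $x_1, \ldots, x_4$ for the orders $d = 1, 2, 3$.

```python
# Reproducing the first iterates of the worked example with the explicit
# update relations for orders 1, 2 and 3 (illustrative sketch).
from fractions import Fraction

def f(x):   return x**3 + 6 * x**2 + 10 * x - 1
def df(x):  return 3 * x**2 + 12 * x + 10
def d2f(x): return 6 * x + 12
def d3f(x): return 6

def step(x, d):
    fx, dfx, d2fx, d3fx = f(x), df(x), d2f(x), d3f(x)
    if d == 1:                                   # Newton
        return x - fx / dfx
    if d == 2:                                   # Halley
        return x - 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
    return x - (6 * fx * dfx**2 - 3 * fx**2 * d2fx) / (
        6 * dfx**3 - 6 * fx * dfx * d2fx + fx**2 * d3fx)   # third order

for d in (1, 2, 3):
    x = Fraction(0)
    for i in range(1, 5):
        x = step(x, d)
        print(f"d={d}  x_{i} = {float(x):.15f}")
    print()
```

The first step of the third-order method reproduces $x_1 = 106/1121 \approx 0.0945584$, as stated above; note that the `float` formatting limits the displayed accuracy to double precision even though the underlying fractions are exact.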


Derivation

An exact derivation of Householder's methods starts from the Padé approximation of order $d+1$ of the function, where the approximant with linear numerator is chosen. Once this has been achieved, the update for the next approximation results from computing the unique zero of the numerator.

The Padé approximation has the form

$$f(x+h) = \frac{a_0 + h}{b_0 + b_1 h + \cdots + b_{d-1}h^{d-1}} + O(h^{d+1})\,.$$

The rational function has a zero at $h = -a_0$.

Just as the Taylor polynomial of degree $d$ has $d+1$ coefficients that depend on the function $f$, the Padé approximation also has $d+1$ coefficients dependent on $f$ and its derivatives. More precisely, in any Padé approximant, the degrees of the numerator and denominator polynomials have to add to the order of the approximant. Therefore, $b_d = 0$ has to hold.

One could determine the Padé approximant starting from the Taylor polynomial of $f$ using Euclid's algorithm. However, starting from the Taylor polynomial of $1/f$ is shorter and leads directly to the given formula. Since

$$(1/f)(x+h) = (1/f)(x) + (1/f)'(x)\,h + \cdots + (1/f)^{(d-1)}(x)\,\frac{h^{d-1}}{(d-1)!} + (1/f)^{(d)}(x)\,\frac{h^d}{d!} + O(h^{d+1})$$

has to be equal to the inverse of the desired rational function, we get, after multiplying with $a_0 + h$, in the power $h^d$ the equation

$$0 = b_d = a_0\,(1/f)^{(d)}(x)\,\frac{1}{d!} + (1/f)^{(d-1)}(x)\,\frac{1}{(d-1)!}\,.$$

Now, solving the last equation for the zero $h = -a_0$ of the numerator results in

$$h = -a_0 = \frac{\frac{1}{(d-1)!}\,(1/f)^{(d-1)}(x)}{\frac{1}{d!}\,(1/f)^{(d)}(x)} = d\,\frac{(1/f)^{(d-1)}(x)}{(1/f)^{(d)}(x)}\,.$$

This implies the iteration formula

$$x_{n+1} = x_n + d\,\frac{(1/f)^{(d-1)}(x_n)}{(1/f)^{(d)}(x_n)}\,.$$
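The coefficient-matching argument can be carried out mechanically. In the sketch below (illustrative; the polynomial, the expansion point $x_0 = 0$ and the order $d = 3$ are arbitrary choices), sympy solves the linear system expressing that $f(x_0+h)\,(b_0+\cdots+b_{d-1}h^{d-1}) - (a_0+h)$ vanishes through order $h^d$, and the zero $-a_0$ of the numerator is compared with the Householder step at $x_0$.

```python
# Illustrative check of the Pade-based derivation: determine the approximant with
# linear numerator a0 + h by coefficient matching and compare the numerator's zero
# -a0 with the Householder step of order d.
import sympy as sp

x, h, a0 = sp.symbols("x h a0")
f = x**3 + 6*x**2 + 10*x - 1            # example polynomial
x0, d = sp.Integer(0), 3                # expansion point and order (arbitrary choices)

b = sp.symbols(f"b0:{d}")               # denominator coefficients b0 .. b_{d-1}
numer = a0 + h
denom = sum(b[k] * h**k for k in range(d))

# require f(x0+h)*denom - numer = O(h^{d+1}), i.e. coefficients of h^0..h^d vanish
expr = sp.expand(f.subs(x, x0 + h) * denom - numer)
sol = sp.solve([expr.coeff(h, k) for k in range(d + 1)], (a0,) + b, dict=True)[0]

step = d * sp.diff(1 / f, x, d - 1) / sp.diff(1 / f, x, d)
print(-sol[a0])                         # zero of the numerator: 106/1121
print(sp.simplify(step.subs(x, x0)))    # Householder step at x0: also 106/1121
```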


Relation to Newton's method

Householder's method applied to the real-valued function $f$ is the same as applying Newton's method $x_{n+1} = x_n - \frac{g(x_n)}{g'(x_n)}$ to find the zeros of the function

$$g(x) = \left|(1/f)^{(d-1)}(x)\right|^{-1/d}\,.$$

In particular, $d=1$ gives Newton's method unmodified and $d=2$ gives Halley's method.
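This relation can be checked numerically. In the sketch below (illustrative; the cubic and the evaluation point are arbitrary choices), the absolute value is replaced by a locally constant sign factor so that sympy can differentiate $g$, and the Newton step on $g$ is compared with the Householder step of the same order at a single point.

```python
# Illustrative check: Newton's method applied to g(x) = |(1/f)^(d-1)(x)|^(-1/d)
# gives the same next iterate as Householder's method of order d.
import sympy as sp

x = sp.Symbol("x", real=True)
f = x**3 + 6*x**2 + 10*x - 1
x0 = 0.05                                      # arbitrary evaluation point
for d in (1, 2, 3):
    u = sp.diff(1 / f, x, d - 1)               # (1/f)^(d-1)
    s = 1 if u.subs(x, x0) > 0 else -1         # local sign, stands in for |.|
    g = (s * u) ** sp.Rational(-1, d)
    newton_on_g = x - g / sp.diff(g, x)
    householder = x + d * u / sp.diff(u, x)
    print(d, float(newton_on_g.subs(x, x0)), float(householder.subs(x, x0)))
```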



External links

* Pascal Sebah and Xavier Gourdon (2001), "Newton's method and high order iteration", http://numbers.computation.free.fr/Constants/Algorithms/newton.html. ''Note'': use the PostScript version of this link; the website version is not compiled correctly.