Gauss–Legendre Method

In numerical analysis and scientific computing, the Gauss–Legendre methods are a family of numerical methods for ordinary differential equations. Gauss–Legendre methods are implicit Runge–Kutta methods. More specifically, they are collocation methods based on the points of Gauss–Legendre quadrature. The Gauss–Legendre method based on ''s'' points has order 2''s''. All Gauss–Legendre methods are A-stable.

The Gauss–Legendre method of order two is the implicit midpoint rule. Its Butcher tableau is:
: \begin{array}{c|c} \tfrac12 & \tfrac12 \\ \hline & 1 \end{array}
The Gauss–Legendre method of order four has Butcher tableau:
: \begin{array}{c|cc} \tfrac12 - \tfrac{\sqrt{3}}{6} & \tfrac14 & \tfrac14 - \tfrac{\sqrt{3}}{6} \\ \tfrac12 + \tfrac{\sqrt{3}}{6} & \tfrac14 + \tfrac{\sqrt{3}}{6} & \tfrac14 \\ \hline & \tfrac12 & \tfrac12 \end{array}
The Gauss–Legendre method of order six has Butcher tableau:
: \begin{array}{c|ccc} \tfrac12 - \tfrac{\sqrt{15}}{10} & \tfrac{5}{36} & \tfrac29 - \tfrac{\sqrt{15}}{15} & \tfrac{5}{36} - \tfrac{\sqrt{15}}{30} \\ \tfrac12 & \tfrac{5}{36} + \tfrac{\sqrt{15}}{24} & \tfrac29 & \tfrac{5}{36} - \tfrac{\sqrt{15}}{24} \\ \tfrac12 + \tfrac{\sqrt{15}}{10} & \tfrac{5}{36} + \tfrac{\sqrt{15}}{30} & \tfrac29 + \tfrac{\sqrt{15}}{15} & \tfrac{5}{36} \\ \hline & \tfrac{5}{18} & \tfrac49 & \tfrac{5}{18} \end{array}
The computational cost of higher-order Gauss–Legendre methods is usually excessive, and thus they are rarely used.
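For concreteness, the order-two method (the implicit midpoint rule) can be sketched in a few lines. The following Python code is an illustration, not part of the original presentation; for a non-stiff problem the implicit stage equation can be solved by simple fixed-point iteration instead of Newton's method, and the function names here are ad hoc.

```python
import math

def implicit_midpoint_step(f, x, h, tol=1e-12, max_iter=50):
    """One step of the order-2 Gauss-Legendre method (implicit midpoint rule).

    The stage slope k solves k = f(x + (h/2) k); for a non-stiff problem
    a fixed-point iteration converges, so Newton's method is not needed.
    """
    k = f(x)  # explicit Euler slope as the initial guess
    for _ in range(max_iter):
        k_new = f(x + 0.5 * h * k)
        if abs(k_new - k) < tol:
            break
        k = k_new
    return x + h * k_new

def integrate(f, x0, h, n):
    """Take n steps of size h starting from x0."""
    x = x0
    for _ in range(n):
        x = implicit_midpoint_step(f, x, h)
    return x

# Example: dx/dt = -x with x(0) = 1; exact solution is exp(-t).
approx = integrate(lambda x: -x, 1.0, 0.01, 100)  # integrate to t = 1
print(abs(approx - math.exp(-1.0)))               # small, O(h^2)
```

Halving the step size reduces the global error by roughly a factor of four, consistent with the method's order two.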


Intuition

Gauss–Legendre Runge–Kutta (GLRK) methods solve an ordinary differential equation \dot{x} = f(x) with x(0) = x_0. The distinguishing feature of GLRK is the estimation of x(h) - x_0 = \int_0^h dt\, f \circ x(t) with Gaussian quadrature.
: x(h) = x(0) + \frac{h}{2} \sum_{i=1}^{l} w_i k_i + O(h^{2l+1}),
where k_i = f \circ x(h c_i) are the sampled velocities, w_i are the quadrature weights, c_i = \tfrac12 (1 + r_i) are the abscissas, and r_i are the roots P_l(r_i) = 0 of the Legendre polynomial of degree l. A further approximation is needed, as k_i is still impossible to evaluate exactly. To maintain a truncation error of order O(h^{2l+1}), the k_i need only be known to order O(h^{2l}). The Runge–Kutta implicit definition
: k_i = f\left(x_0 + h \sum_j a_{ij} k_j \right)
is invoked to accomplish this. It is an implicit constraint that must be solved by a root-finding algorithm such as Newton's method. The values of the Runge–Kutta parameters a_{ij} can be determined from a Taylor series expansion in h.
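The quadrature data r_i, w_i, and c_i above can be inspected directly. The following Python sketch (an illustration using NumPy's leggauss routine; the variable names are ad hoc) prints the l = 2 abscissas and weights and confirms that the rule integrates polynomials of degree up to 2l - 1 exactly:

```python
import numpy as np

# Nodes r_i (roots of the Legendre polynomial P_l) and weights w_i
# on [-1, 1], here for l = 2 stages (the order-4 method).
l = 2
r, w = np.polynomial.legendre.leggauss(l)

# Map to the step interval [0, h]: abscissas c_i = (1 + r_i)/2.
c = (1.0 + r) / 2.0
print(c)   # 1/2 -+ sqrt(3)/6, i.e. the order-4 stage times
print(w)   # both weights equal 1

# Sanity check: x(h) ~ x(0) + (h/2) * sum_i w_i f(x(h c_i)) integrates
# polynomials of degree <= 2l - 1 = 3 exactly; try f(x(t)) = t^3.
h = 0.7
exact = h**4 / 4                         # integral of t^3 over [0, h]
quad = (h / 2) * np.sum(w * (h * c)**3)
print(abs(exact - quad))                 # agrees to rounding error
```

Exactness on degree 2l - 1 is precisely what gives the l-stage method its order 2l.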


Practical example

The Gauss–Legendre methods are implicit, so in general they cannot be applied exactly. Instead one makes an educated guess of k_i, and then uses Newton's method to converge arbitrarily close to the true solution. Below is a Matlab function which implements the Gauss–Legendre method of order four.

% starting point
x = [ 10.5440; 4.1124; 35.8233];
dt = 0.01;
N = 10000;
x_series = [x];
for i = 1:N
    x = gauss_step(x, @lorenz_dynamics, dt, 1e-7, 1, 100);
    x_series = [x_series x];
end

plot3( x_series(1,:), x_series(2,:), x_series(3,:) );
set(gca, 'xtick', [], 'ytick', [], 'ztick', []);
title('Lorenz Attractor');
return;

function [td, j] = lorenz_dynamics(state)
    % return a time derivative and a Jacobian of that time derivative
    x = state(1); y = state(2); z = state(3);
    sigma = 10;
    beta = 8/3;
    rho = 28;
    td = [sigma*(y - x); x*(rho - z) - y; x*y - beta*z];
    j = [-sigma, sigma, 0;
         rho - z, -1, -x;
         y, x, -beta];
end

function x_next = gauss_step( x, dynamics, dt, threshold, damping, max_iterations )
    [d, ~] = size(x);
    sq3 = sqrt(3);
    if damping > 1 || damping <= 0
        error('damping should be between 0 and 1.')
    end

    % Use explicit Euler steps as initial guesses
    [k, ~] = dynamics(x);
    x1_guess = x + (1/2 - sq3/6)*dt*k;
    x2_guess = x + (1/2 + sq3/6)*dt*k;
    [k1, ~] = dynamics(x1_guess);
    [k2, ~] = dynamics(x2_guess);

    a11 = 1/4;
    a12 = 1/4 - sq3/6;
    a21 = 1/4 + sq3/6;
    a22 = 1/4;

    % Residual of the implicit stage equations (named "residual" rather
    % than "error" so the built-in error function is not shadowed)
    residual = @(k1, k2) [k1 - dynamics(x + (a11*k1 + a12*k2)*dt);
                          k2 - dynamics(x + (a21*k1 + a22*k2)*dt)];

    er = residual(k1, k2);
    iteration = 1;
    while (norm(er) > threshold && iteration < max_iterations)
        fprintf('Newton iteration %d: error is %f.\n', iteration, norm(er));
        iteration = iteration + 1;

        [~, j1] = dynamics(x + (a11*k1 + a12*k2)*dt);
        [~, j2] = dynamics(x + (a21*k1 + a22*k2)*dt);

        j = [eye(d) - dt*a11*j1, -dt*a12*j1;
             -dt*a21*j2, eye(d) - dt*a22*j2];

        k_next = [k1; k2] - damping * linsolve(j, er);
        k1 = k_next(1:d);
        k2 = k_next(d + (1:d));
        er = residual(k1, k2);
    end
    if norm(er) > threshold
        error('Newton did not converge by %d iterations.', max_iterations);
    end
    x_next = x + dt/2 * (k1 + k2);
end

This algorithm is surprisingly cheap. The error in k_i can fall below 10^{-12} in as few as 2 Newton steps. The only extra work compared to explicit Runge–Kutta methods is the computation of the Jacobian.


Time-symmetric variants

At the cost of adding an additional implicit relation, these methods can be adapted to have time-reversal symmetry. In these methods, the averaged position \frac{x_f + x_i}{2} is used in computing k_i instead of just the initial position x_i as in standard Runge–Kutta methods. The method of order 2 is just an implicit midpoint method.
: k_1 = f\left(\frac{x_i + x_f}{2}\right)
: x_f = x_i + h k_1
The method of order 4 with 2 stages is as follows.
: k_1 = f\left( \frac{x_i + x_f}{2} - \frac{\sqrt{3}}{6} h k_2 \right)
: k_2 = f\left( \frac{x_i + x_f}{2} + \frac{\sqrt{3}}{6} h k_1 \right)
: x_f = x_i + \frac{h}{2}(k_1 + k_2)
The method of order 6 with 3 stages is as follows.
: k_1 = f\left( \frac{x_i + x_f}{2} - \frac{\sqrt{15}}{15} h k_2 - \frac{\sqrt{15}}{30} h k_3 \right)
: k_2 = f\left( \frac{x_i + x_f}{2} + \frac{\sqrt{15}}{24} h k_1 - \frac{\sqrt{15}}{24} h k_3 \right)
: k_3 = f\left( \frac{x_i + x_f}{2} + \frac{\sqrt{15}}{30} h k_1 + \frac{\sqrt{15}}{15} h k_2 \right)
: x_f = x_i + \frac{h}{18}( 5 k_1 + 8 k_2 + 5 k_3 )
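A minimal Python sketch of the order-4 time-symmetric method (illustrative names; fixed-point iteration rather than Newton, which is adequate for a non-stiff problem) demonstrates the defining property: a step of +h followed by a step of -h returns to the starting point, up to the iteration tolerance.

```python
import math

def ts4_step(f, x_i, h, tol=1e-13, max_iter=100):
    """Order-4 time-symmetric variant (2 stages) for a scalar ODE x' = f(x).

    Iterates the implicit relations
        k1  = f((x_i + x_f)/2 - (sqrt(3)/6) h k2)
        k2  = f((x_i + x_f)/2 + (sqrt(3)/6) h k1)
        x_f = x_i + (h/2)(k1 + k2)
    by fixed-point iteration until they are self-consistent.
    """
    s = math.sqrt(3.0) / 6.0
    k1 = k2 = f(x_i)          # explicit Euler slope as the initial guess
    x_f = x_i + h * k1
    for _ in range(max_iter):
        m = 0.5 * (x_i + x_f)  # averaged position
        k1n = f(m - s * h * k2)
        k2n = f(m + s * h * k1)
        x_fn = x_i + 0.5 * h * (k1n + k2n)
        done = max(abs(k1n - k1), abs(k2n - k2), abs(x_fn - x_f)) < tol
        k1, k2, x_f = k1n, k2n, x_fn
        if done:
            break
    return x_f

# Time-reversal check on x' = sin(x): stepping forward then backward
# recovers the initial condition, since the relations are symmetric
# under (x_i, x_f, h) -> (x_f, x_i, -h).
x0 = 0.8
x1 = ts4_step(math.sin, x0, 0.1)
x_back = ts4_step(math.sin, x1, -0.1)
print(abs(x_back - x0))   # near zero, limited by the iteration tolerance
```

This reversibility is the reason such variants are favored for long-time integration of conservative systems.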

