
In mathematics, error analysis is the study of the kind and quantity of error, or uncertainty, that may be present in the solution to a problem. This issue is particularly prominent in applied areas such as numerical analysis and statistics.


Error analysis in numerical modeling

In numerical simulation or modeling of real systems, error analysis is concerned with the changes in the output of the model as the parameters to the model vary about a mean. For instance, in a system modeled as a function of two variables z = f(x, y), error analysis deals with the propagation of the numerical errors in x and y (around mean values \bar{x} and \bar{y}) to the error in z (around a mean \bar{z}). In numerical analysis, error analysis comprises both forward error analysis and backward error analysis.
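The propagation described above is often approximated to first order by linearizing f about the mean values. The following sketch, with a hypothetical helper `propagate` and illustrative inputs, estimates the partial derivatives by central finite differences:

```python
import math

def propagate(f, x, y, sx, sy, h=1e-6):
    """First-order (linearized) propagation of the input errors sx, sy
    to the output of z = f(x, y). The partial derivatives are estimated
    with central finite differences of step h. Illustrative sketch only."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    # sigma_z^2 ~ (df/dx)^2 sigma_x^2 + (df/dy)^2 sigma_y^2
    return math.sqrt((dfdx * sx) ** 2 + (dfdy * sy) ** 2)

# Example: z = x * y with x = 2.0 +/- 0.1 and y = 3.0 +/- 0.2
sz = propagate(lambda x, y: x * y, 2.0, 3.0, 0.1, 0.2)
print(sz)  # ~ 0.5 = sqrt((3*0.1)^2 + (2*0.2)^2)
```

This is the standard first-order formula for independent errors; it ignores correlations between x and y and higher-order terms.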


Forward error analysis

Forward error analysis involves the analysis of a function z' = f'(a_0, a_1, \dots, a_n), which is an approximation (usually a finite polynomial) to a function z = f(a_0, a_1, \dots, a_n), to determine the bounds on the error in the approximation; i.e., to find \epsilon such that 0 \le |z - z'| \le \epsilon. The evaluation of forward errors is desired in validated numerics.
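As a concrete instance of a forward error bound, consider approximating e^x by a truncated Taylor polynomial. The Lagrange remainder gives |e^x - p_n(x)| \le e^{|x|} |x|^{n+1} / (n+1)!, a computable \epsilon. The sketch below (hypothetical helper name `exp_taylor`, illustrative choices of x and n) checks the bound numerically:

```python
import math

def exp_taylor(x, n):
    """Truncated Taylor polynomial for e^x: sum of the first n+1 terms."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x, n = 1.0, 8
approx = exp_taylor(x, n)
true = math.exp(x)

# Forward error bound from the Lagrange remainder term:
# |e^x - p_n(x)| <= e^|x| * |x|^(n+1) / (n+1)!
bound = math.exp(abs(x)) * abs(x)**(n + 1) / math.factorial(n + 1)

assert abs(true - approx) <= bound
print(abs(true - approx), "<=", bound)
```

Here the bound (about 7.5e-6 for these values) is guaranteed in exact arithmetic; in floating point, rounding in the summation adds a further small contribution.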


Backward error analysis

Backward error analysis involves the analysis of the approximation function z' = f'(a_0, a_1, \dots, a_n) to determine the bounds on the parameters a_i = \bar{a}_i \pm \epsilon_i such that the result z' = z. Backward error analysis, the theory of which was developed and popularized by James H. Wilkinson, can be used to establish that an algorithm implementing a numerical function is numerically stable. The basic approach is to show that although the calculated result, due to roundoff errors, will not be exactly correct, it is the exact solution to a nearby problem with slightly perturbed input data. If the perturbation required is small, on the order of the uncertainty in the input data, then the results are in some sense as accurate as the data "deserve"; the algorithm is then said to be ''backward stable''. Stability is a measure of the sensitivity of a given numerical procedure to rounding errors; by contrast, the condition number of a function for a given problem indicates the inherent sensitivity of the function to small perturbations in its input, and is independent of the implementation used to solve the problem.
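The "exact solution to a nearby problem" viewpoint can be illustrated with a single floating-point addition: the rounded result of a + b is the exact sum of inputs perturbed by a relative amount on the order of machine epsilon. A small sketch using exact rational arithmetic (illustrative inputs; not a general backward-error analysis):

```python
from fractions import Fraction

a, b = 0.1, 0.2          # binary floats, neither exactly representable
computed = a + b          # rounded floating-point sum

# Exact sum of the actual stored inputs, via rational arithmetic.
exact = Fraction(a) + Fraction(b)

# Backward error view: computed == exact sum of a*(1+delta), b*(1+delta),
# so delta measures how far the "nearby problem" is from the given one.
delta = (Fraction(computed) - exact) / exact
print(float(delta))       # on the order of machine epsilon (~1e-16)
```

Since the perturbation delta is at the level of the uncertainty already present in rounding the inputs, the addition is backward stable in Wilkinson's sense.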


Applications


Global positioning system

The analysis of errors computed using the Global Positioning System (GPS) is important for understanding how GPS works and for knowing what magnitude of errors should be expected. GPS makes corrections for receiver clock errors and other effects, but residual errors remain that are not corrected. The system was created by the United States Department of Defense (DOD) in the 1970s and has come to be widely used for navigation both by the U.S. military and the general public.


Molecular dynamics simulation

In molecular dynamics (MD) simulations, there are errors due to inadequate sampling of the phase space or infrequently occurring events; these lead to statistical error from random fluctuation in the measurements. For a series of M measurements of a fluctuating property A, the mean value is \langle A \rangle = \frac{1}{M} \sum_{\mu=1}^{M} A_{\mu}. When these measurements are independent, the variance of the mean is \sigma^{2}(\langle A \rangle) = \frac{1}{M}\,\sigma^{2}(A), but in most MD simulations there is correlation between the quantity at different times, so the variance of the mean is underestimated, because the effective number of independent measurements is actually less than M. In such situations the variance is rewritten as \sigma^{2}(\langle A \rangle) = \frac{1}{M}\,\sigma^{2}(A)\left[1 + 2\sum_{\mu}\left(1 - \frac{\mu}{M}\right)\phi_{\mu}\right], where \phi_{\mu} is the autocorrelation function defined by \phi_{\mu} = \frac{\langle A_{\mu} A_{0} \rangle - \langle A \rangle^{2}}{\langle A^{2} \rangle - \langle A \rangle^{2}}. The autocorrelation function can then be used to estimate the error bar. A much simpler alternative is the method of block averaging (D. C. Rapaport, ''The Art of Molecular Dynamics Simulation'', Cambridge University Press).
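Block averaging can be sketched as follows: split the correlated series into blocks long compared with the correlation time, then treat the block means as approximately independent samples. The helper name `block_average_error` and the AR(1) toy series standing in for an MD observable are illustrative assumptions:

```python
import random
import statistics

def block_average_error(data, n_blocks=10):
    """Estimate the standard error of the mean of a correlated series
    by block averaging: the scatter of the block means replaces the
    (underestimating) naive formula sigma/sqrt(N)."""
    block_len = len(data) // n_blocks
    block_means = [
        statistics.fmean(data[i * block_len:(i + 1) * block_len])
        for i in range(n_blocks)
    ]
    # Standard error computed from the n_blocks roughly independent means.
    return statistics.stdev(block_means) / n_blocks ** 0.5

# Correlated toy series: an AR(1) process as a stand-in for an MD observable.
random.seed(0)
x, series = 0.0, []
for _ in range(10_000):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    series.append(x)

print(block_average_error(series))
```

For strongly correlated data the block estimate typically exceeds the naive sigma/sqrt(N), reflecting the reduced effective number of independent measurements; in practice one checks that the estimate has plateaued as the block length grows.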


Scientific data verification

Measurements generally have a small amount of error, and repeated measurements of the same item will generally result in slight differences in readings. These differences can be analyzed and follow certain known mathematical and statistical properties. Should a set of data appear to be too faithful to the hypothesis, i.e., the amount of error that would normally appear in such measurements is absent, a conclusion can be drawn that the data may have been forged. Error analysis alone is typically not sufficient to prove that data have been falsified or fabricated, but it may provide the supporting evidence necessary to confirm suspicions of misconduct.
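A crude version of this idea can be sketched as a check that the scatter of repeated readings is not implausibly small relative to the instrument's known precision. The function name, threshold, and readings below are all hypothetical; a real analysis would use a proper statistical test rather than a fixed ratio:

```python
import statistics

def looks_too_clean(readings, expected_sd, ratio=0.3):
    """Illustrative heuristic: flag a series of repeated measurements
    whose sample scatter is far below the instrument's known precision
    expected_sd. The ratio threshold is an arbitrary assumption."""
    return statistics.stdev(readings) < ratio * expected_sd

# Suspiciously uniform readings for an instrument with ~0.1 precision:
print(looks_too_clean([9.81, 9.81, 9.81, 9.82, 9.81], 0.1))   # True
# Readings with scatter consistent with the stated precision:
print(looks_too_clean([9.70, 9.90, 9.80, 9.95, 9.65], 0.1))   # False
```

The point is qualitative: genuine measurements carry the error their apparatus implies, and its absence is itself a statistical signal.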


See also

* Error analysis (linguistics)
* Error bar
* Errors and residuals in statistics
* Propagation of uncertainty
* Validated numerics


References


External links



All about error analysis.