Asymptotics

In mathematical analysis, asymptotic analysis, also known as asymptotics, is a method of describing limiting behavior. As an illustration, suppose that we are interested in the properties of a function f(n) as n becomes very large. If f(n) = n^2 + 3n, then as n becomes very large, the term 3n becomes insignificant compared to n^2. The function f(n) is said to be "''asymptotically equivalent'' to n^2, as n \to \infty". This is often written symbolically as f(n) \sim n^2, which is read as "f(n) is asymptotic to n^2".

An example of an important asymptotic result is the prime number theorem. Let \pi(x) denote the prime-counting function (which is not directly related to the constant pi), i.e. \pi(x) is the number of prime numbers that are less than or equal to x. Then the theorem states that \pi(x)\sim\frac{x}{\ln x}.

Asymptotic analysis is commonly used in computer science as part of the analysis of algorithms and is often expressed there in terms of big O notation.
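
As a quick numerical illustration of both statements, here is a minimal sketch (assuming Python with SymPy available for the prime-counting function; f(n) = n^2 + 3n is the toy example above):

```python
from math import log
from sympy import primepi  # exact prime-counting function pi(x)

def f(n):
    return n**2 + 3*n

# f(n)/n^2 tends to 1 as n grows.
for n in (10, 1_000, 100_000):
    print(n, f(n) / n**2)

# pi(x) / (x / ln x) also tends to 1, but slowly (still about 1.07 at 10^7).
for x in (10**3, 10**5, 10**7):
    print(x, primepi(x) / (x / log(x)))
```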


Definition

Formally, given functions f(x) and g(x), we define a binary relation f(x) \sim g(x) \quad (\text{as } x\to\infty) if and only if \lim_{x \to \infty} \frac{f(x)}{g(x)} = 1. The symbol \sim is the tilde. The relation is an equivalence relation on the set of functions of x; the functions f and g are said to be ''asymptotically equivalent''. The domain of f and g can be any set for which the limit is defined: e.g. real numbers, complex numbers, positive integers.

The same notation is also used for other ways of passing to a limit: e.g. x \to 0, x \downarrow 0, |x| \to 0. The way of passing to the limit is often not stated explicitly, if it is clear from the context.

Although the above definition is common in the literature, it is problematic if g(x) is zero infinitely often as x goes to the limiting value. For that reason, some authors use an alternative definition. The alternative definition, in little-o notation, is that f \sim g if and only if f(x)=g(x)(1+o(1)). This definition is equivalent to the prior definition if g(x) is not zero in some neighbourhood of the limiting value.
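
For instance, with the limit taken at 0 rather than at infinity, \sin x \sim x as x \to 0, since the ratio tends to 1. A tiny Python check (a sketch, not part of the source text):

```python
import math

# sin(x) ~ x as x -> 0: the ratio sin(x)/x tends to 1.
for x in (1.0, 0.1, 0.01, 0.001):
    print(x, math.sin(x) / x)
```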


Properties

If f(x) \sim g(x) and a(x) \sim b(x), as x \to \infty, then the following hold:
* f^r \sim g^r, for every real r
* \log(f) \sim \log(g) if \lim g \neq 1
* f\times a \sim g\times b
* f / a \sim g / b

Such properties allow asymptotically-equivalent functions to be freely exchanged in many algebraic expressions (see the numerical sketch below).

Note that these properties hold only with respect to the stated limiting process, here x \to \infty; in other words, they apply only for sufficiently large values of x. If x does not tend to infinity but instead to some finite constant c, then the limits in the definition above, \lim_{x \to c} \frac{f(x)}{g(x)} and \lim_{x \to c} \frac{a(x)}{b(x)}, need not equal 1. In that case the respective functions are no longer asymptotically equivalent and the properties above cannot be applied. As a simple example, let f(x) = x^3 + 2x and g(x) = x^3. Then \lim_{x\to\infty} \frac{f(x)}{g(x)} = 1, but \lim_{x\to 0.5} \frac{f(x)}{g(x)} = 9. Hence f(x) and g(x) are not asymptotically equivalent as x \to 0.5.
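
A quick numerical sketch in Python (using the same f(x) = x^3 + 2x and g(x) = x^3 as above, plus hypothetical a(x) = x + 5 and b(x) = x) of both the product property and the failure at the finite point x = 0.5:

```python
def f(x): return x**3 + 2*x
def g(x): return x**3
def a(x): return x + 5
def b(x): return x

# As x grows, f/g and (f*a)/(g*b) both approach 1 (product property).
for x in (10.0, 1e3, 1e6):
    print(x, f(x) / g(x), (f(x) * a(x)) / (g(x) * b(x)))

# At the finite point x = 0.5 the ratio is 9, not 1, so f and g
# are not asymptotically equivalent as x -> 0.5.
print(f(0.5) / g(0.5))  # 9.0
```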


Examples of asymptotic formulas

* Factorial: n! \sim \sqrt{2\pi n}\left(\frac{n}{e}\right)^n (this is Stirling's approximation).
* Partition function: For a positive integer ''n'', the partition function, ''p''(''n''), gives the number of ways of writing the integer ''n'' as a sum of positive integers, where the order of addends is not considered. p(n)\sim \frac{1}{4n\sqrt{3}} e^{\pi\sqrt{\frac{2n}{3}}}
* Airy function: The Airy function, Ai(''x''), is a solution of the differential equation y'' - xy = 0; it has many applications in physics. \operatorname{Ai}(x) \sim \frac{e^{-\frac{2}{3}x^{3/2}}}{2\sqrt{\pi}\,x^{1/4}}
* Hankel functions: \begin{align} H_\alpha^{(1)}(z) &\sim \sqrt{\frac{2}{\pi z}}\, e^{i\left(z-\frac{\pi\alpha}{2}-\frac{\pi}{4}\right)} \\ H_\alpha^{(2)}(z) &\sim \sqrt{\frac{2}{\pi z}}\, e^{-i\left(z-\frac{\pi\alpha}{2}-\frac{\pi}{4}\right)} \end{align}
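
A short Python check (a sketch) that Stirling's approximation above improves as n grows: the ratio of n! to the approximation tends to 1.

```python
import math

def stirling(n):
    # Stirling's approximation: sqrt(2*pi*n) * (n/e)**n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 20, 100):
    print(n, math.factorial(n) / stirling(n))  # approaches 1 (roughly 1.017, 1.004, 1.0008)
```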


Asymptotic expansion

An asymptotic expansion of a function f(x) is in practice an expression of that function in terms of a series, the partial sums of which do not necessarily converge, but such that taking any initial partial sum provides an asymptotic formula for f. The idea is that successive terms provide an increasingly accurate description of the order of growth of f.

In symbols, it means we have f \sim g_1, but also f - g_1 \sim g_2 and f - g_1 - \cdots - g_{k-1} \sim g_k for each fixed ''k''. In view of the definition of the \sim symbol, the last equation means f - (g_1 + \cdots + g_k) = o(g_k) in the little-o notation, i.e., f - (g_1 + \cdots + g_k) is much smaller than g_k.

The relation f - g_1 - \cdots - g_{k-1} \sim g_k takes its full meaning if g_{k+1} = o(g_k) for all ''k'', which means the g_k form an asymptotic scale. In that case, some authors may abusively write f \sim g_1 + \cdots + g_k to denote the statement f - (g_1 + \cdots + g_k) = o(g_k). One should however be careful that this is not a standard use of the \sim symbol, and that it does not correspond to the definition given in the Definition section above. In the present situation, the relation g_k = o(g_{k-1}) actually follows from combining steps ''k'' and ''k''−1; by subtracting f - g_1 - \cdots - g_{k-2} = g_{k-1} + o(g_{k-1}) from f - g_1 - \cdots - g_{k-2} - g_{k-1} = g_k + o(g_k), one gets g_k + o(g_k) = o(g_{k-1}), i.e. g_k = o(g_{k-1}).

In case the asymptotic expansion does not converge, for any particular value of the argument there will be a particular partial sum which provides the best approximation and adding additional terms will decrease the accuracy. This optimal partial sum will usually have more terms as the argument approaches the limit value.
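
To see the non-convergence and the "optimal partial sum" behaviour concretely, here is a hedged Python sketch (assuming SciPy is available) using the expansion x e^x E_1(x) \sim \sum_n (-1)^n n!/x^n listed in the next section: at x = 5 the error of the partial sums shrinks for the first few terms and then grows.

```python
import math
from scipy.special import exp1  # exponential integral E_1(x)

x = 5.0
exact = x * math.exp(x) * exp1(x)   # roughly 0.852

# Partial sums of the divergent series sum_n (-1)^n n!/x^n
s, fact = 0.0, 1.0
for n in range(0, 12):
    if n > 0:
        fact *= n
    s += (-1) ** n * fact / x ** n
    print(n, s, abs(s - exact))  # error shrinks until n is near x, then grows
```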


Examples of asymptotic expansions

* Gamma function: \frac{e^x}{x^x \sqrt{2\pi x}} \Gamma(x+1) \sim 1+\frac{1}{12x}+\frac{1}{288x^2}-\frac{139}{51840x^3}-\cdots \ (x \to \infty)
* Exponential integral: xe^xE_1(x) \sim \sum_{n=0}^\infty \frac{(-1)^n n!}{x^n} \ (x \to \infty)
* Error function: \sqrt{\pi}\,x e^{x^2}\operatorname{erfc}(x) \sim 1+\sum_{n=1}^\infty (-1)^n \frac{(2n-1)!!}{(2x^2)^n} \ (x \to \infty), where (2n-1)!! is the double factorial.
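
A small numerical check of the error-function expansion above (a sketch in plain Python): at x = 3 the partial sums oscillate around the exact value of \sqrt{\pi}\, x\, e^{x^2} \operatorname{erfc}(x).

```python
import math

x = 3.0
lhs = math.sqrt(math.pi) * x * math.exp(x * x) * math.erfc(x)

# Partial sums 1 - 1/(2x^2) + 3/(4x^4) - 15/(8x^6) + ... using (2n-1)!!/(2x^2)^n
s, sign, dfact = 1.0, -1.0, 1.0
for n in range(1, 5):
    dfact *= 2 * n - 1                      # (2n-1)!!
    s += sign * dfact / (2 * x * x) ** n
    sign = -sign
    print(n, s)

print("exact:", lhs)  # about 0.952; the partial sums oscillate around it
```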


Worked example

Asymptotic expansions often occur when an ordinary series is used in a formal expression that forces the taking of values outside of its domain of convergence. For example, we might start with the ordinary series \frac{1}{1-w}=\sum_{n=0}^\infty w^n. The expression on the left is valid on the entire complex plane w \ne 1, while the right hand side converges only for |w| < 1. Multiplying by e^{-w/t} and integrating both sides yields \int_0^\infty \frac{e^{-w/t}}{1-w} \, dw = \sum_{n=0}^\infty t^{n+1} \int_0^\infty e^{-u} u^n \, du. The integral on the left hand side, interpreted as a Cauchy principal value, can be expressed in terms of the exponential integral. The integral on the right hand side, after the substitution u=w/t, may be recognized as the gamma function. Evaluating both, one obtains the asymptotic expansion e^{-\frac{1}{t}} \operatorname{Ei}\left(\frac{1}{t}\right) = \sum_{n=0}^\infty n! \; t^{n+1}. Here, the right hand side is clearly not convergent for any non-zero value of ''t''. However, by keeping ''t'' small, and truncating the series on the right to a finite number of terms, one may obtain a fairly good approximation to the value of \operatorname{Ei}(1/t). Substituting x = -1/t and noting that \operatorname{Ei}(x) = -E_1(-x) results in the asymptotic expansion given earlier in this article.
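
A hedged numerical check of the final identity (assuming SciPy for \operatorname{Ei}): at t = 0.1, truncating \sum n!\, t^{n+1} after roughly 1/t terms gives a good approximation to e^{-1/t}\operatorname{Ei}(1/t), after which accuracy degrades.

```python
import math
from scipy.special import expi  # exponential integral Ei

t = 0.1
exact = math.exp(-1 / t) * expi(1 / t)   # about 0.1131

s, fact = 0.0, 1.0
for n in range(0, 15):
    if n > 0:
        fact *= n
    s += fact * t ** (n + 1)
    print(n, s, abs(s - exact))  # error is smallest near n = 1/t, then grows
```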


Asymptotic distribution

In mathematical statistics, an asymptotic distribution is a hypothetical distribution that is in a sense the "limiting" distribution of a sequence of distributions. A distribution is an ordered set of random variables Z_i for i = 1, \ldots, n, for some positive integer n. An asymptotic distribution allows i to range without bound, that is, n is infinite.

A special case of an asymptotic distribution is when the late entries go to zero; that is, the Z_i go to 0 as i goes to infinity. Some instances of "asymptotic distribution" refer only to this special case. This is based on the notion of an asymptotic function which cleanly approaches a constant value (the ''asymptote'') as the independent variable goes to infinity; "clean" in this sense meaning that for any desired closeness epsilon there is some value of the independent variable after which the function never differs from the constant by more than epsilon.

An asymptote is a straight line that a curve approaches but never meets or crosses. Informally, one may speak of the curve meeting the asymptote "at infinity" although this is not a precise definition. In the equation y = \frac{1}{x}, ''y'' becomes arbitrarily small in magnitude as ''x'' increases.
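
As a concrete, hedged illustration (not from the source text), the classical central limit theorem supplies an asymptotic distribution: standardized sample means of, say, uniform random variables are approximately standard normal for large sample sizes. A minimal NumPy sketch, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 10000

# Standardized means of n Uniform(0,1) draws; mean 0.5, variance 1/12.
samples = rng.random((reps, n))
z = (samples.mean(axis=1) - 0.5) / np.sqrt(1 / 12 / n)

# Fraction within one standard deviation: close to 0.6827 for a normal limit.
print(np.mean(np.abs(z) < 1))
```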


Applications

Asymptotic analysis is used in several mathematical sciences. In statistics, asymptotic theory provides limiting approximations of the probability distribution of sample statistics, such as the likelihood ratio statistic and the expected value of the deviance. Asymptotic theory does not provide a method of evaluating the finite-sample distributions of sample statistics, however. Non-asymptotic bounds are provided by methods of approximation theory.

Examples of applications are the following.
* In applied mathematics, asymptotic analysis is used to build numerical methods to approximate equation solutions.
* In mathematical statistics and probability theory, asymptotics are used in analysis of long-run or large-sample behaviour of random variables and estimators.
* In computer science, in the analysis of algorithms, considering the performance of algorithms.
* The behavior of physical systems, an example being statistical mechanics.
* In accident analysis, when identifying the causes of crashes through count modelling with a large number of crash counts in a given time and space.

Asymptotic analysis is a key tool for exploring the ordinary and partial differential equations which arise in the mathematical modelling of real-world phenomena (Howison, S. (2005), ''Practical Applied Mathematics'', Cambridge University Press). An illustrative example is the derivation of the boundary layer equations from the full Navier–Stokes equations governing fluid flow. In many cases, the asymptotic expansion is in powers of a small parameter, ε: in the boundary layer case, this is the nondimensional ratio of the boundary layer thickness to a typical length scale of the problem. Indeed, applications of asymptotic analysis in mathematical modelling often center around a nondimensional parameter which has been shown, or assumed, to be small through a consideration of the scales of the problem at hand.

Asymptotic expansions typically arise in the approximation of certain integrals (Laplace's method, saddle-point method, method of steepest descent) or in the approximation of probability distributions (Edgeworth series). The Feynman graphs in quantum field theory are another example of asymptotic expansions which often do not converge.


See also

*Asymptote
*Asymptotic computational complexity
*Asymptotic density (in number theory)
*Asymptotic theory (statistics)
*Asymptotology
*Big O notation
*Leading-order term
*Method of dominant balance (for ODEs)
*Method of matched asymptotic expansions
*Watson's lemma




External links


*''Asymptotic Analysis'', home page of the journal, published by IOS Press
*A paper on time series analysis using asymptotic distribution