In mathematics and statistics, a stationary process (or a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time. If you draw a line through the middle of a stationary process then it should be flat; it may have 'seasonal' cycles, but overall it does not trend up or down.

Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data are often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or of a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. In the latter case of a deterministic trend, the process is called a trend-stationary process, and stochastic shocks have only transitory effects after which the variable tends toward a deterministically evolving (non-constant) mean.

A trend-stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not include a trend-like behaviour is a cyclostationary process, which is a stochastic process that varies cyclically with time.

For many applications strict-sense stationarity is too restrictive. Other forms of stationarity such as wide-sense stationarity or ''N''-th-order stationarity are then employed. The definitions for different kinds of stationarity are not consistent among different authors (see Other terminology).


Strict-sense stationarity


Definition

Formally, let \left\{X_t\right\} be a stochastic process and let F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of \left\{X_t\right\} at times t_1 + \tau, \ldots, t_n + \tau. Then, \left\{X_t\right\} is said to be strictly stationary, strongly stationary or strict-sense stationary if

:F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) = F_X(x_{t_1}, \ldots, x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \mathbb{N}.

Since \tau does not affect F_X(\cdot), F_X is not a function of time.


Examples

White noise is the simplest example of a stationary process. An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of ''N'' possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving-average processes, which are both subsets of the autoregressive moving-average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values; important non-stationary special cases are those where unit roots exist in the model.


Example 1

Let Y be any scalar random variable, and define a time series \left\{X_t\right\} by

:X_t = Y \qquad \text{for all } t.

Then \left\{X_t\right\} is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by Y, rather than taking the expected value of Y. The time average of X_t does not converge to the ensemble mean, since the process is not ergodic.
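A quick numerical check of this example (an illustrative NumPy sketch; drawing Y from a standard normal distribution is an arbitrary choice, so E[Y] = 0):

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw Y once; the whole realisation is the constant path X_t = Y.
Y = rng.normal()            # standard normal, so the ensemble mean E[Y] is 0
X = np.full(1000, Y)

# The time average of this single realisation is Y itself, not E[Y] = 0:
# averaging over time cannot recover the ensemble mean here.
time_average = X.mean()
```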


Example 2

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let Y have a uniform distribution on (0, 2\pi] and define the time series \left\{X_t\right\} by

:X_t = \cos(t + Y) \qquad \text{for } t \in \mathbb{R}.

Then \left\{X_t\right\} is strictly stationary, since ((t + Y) \bmod 2\pi) follows the same uniform distribution as Y for any t.
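A Monte Carlo check of the claim above (an illustrative sketch; the sample size and the test values of t are arbitrary): for any t, the marginal distribution of X_t = cos(t + Y) has mean 0 and variance 1/2.

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.uniform(0.0, 2.0 * np.pi, size=200_000)   # one draw of Y per realisation

# Because (t + Y) mod 2*pi is again uniform on (0, 2*pi], the marginal
# distribution of X_t = cos(t + Y) is identical for every t:
# mean 0 and variance 1/2, whatever the value of t.
stats = {}
for t in (0.0, 1.7, 42.0):
    X_t = np.cos(t + Y)
    stats[t] = (X_t.mean(), X_t.var())
```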


Example 3

Keep in mind that a white noise is not necessarily strictly stationary. Let \omega be a random variable uniformly distributed on the interval (0, 2\pi) and define the time series \left\{z_t\right\} by

:z_t = \cos(t\omega) \qquad (t = 1, 2, \ldots).

Then

:\begin{align}
\mathbb{E}(z_t) &= \frac{1}{2\pi} \int_0^{2\pi} \cos(t\omega) \,d\omega = 0,\\
\operatorname{Var}(z_t) &= \frac{1}{2\pi} \int_0^{2\pi} \cos^2(t\omega) \,d\omega = \tfrac{1}{2},\\
\operatorname{Cov}(z_t, z_j) &= \frac{1}{2\pi} \int_0^{2\pi} \cos(t\omega)\cos(j\omega) \,d\omega = 0 \qquad \text{for all } t \neq j.
\end{align}

So \{z_t\} is a white noise; however, it is not strictly stationary.


''N''th-order stationarity

In strict-sense stationarity, the distribution of n samples of the stochastic process must be equal to the distribution of the samples shifted in time ''for all'' n. ''N''-th-order stationarity is a weaker form of stationarity where this is only requested for all n up to a certain order N. A random process \left\{X_t\right\} is said to be ''N''-th-order stationary if

:F_X(x_{t_1+\tau}, \ldots, x_{t_n+\tau}) = F_X(x_{t_1}, \ldots, x_{t_n}) \quad \text{for all } \tau, t_1, \ldots, t_n \in \mathbb{R} \text{ and for all } n \in \{1, \ldots, N\}.


Weak or wide-sense stationarity


Definition

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the first moment (i.e. the mean) and autocovariance do not vary with respect to time and that the second moment is finite for all times. Any strictly stationary process which has a finite mean and a covariance is also WSS.

So, a continuous-time random process \left\{X_t\right\} which is WSS has the following restrictions on its mean function m_X(t) \triangleq \operatorname{E}[X_t] and autocovariance function K_{XX}(t_1, t_2) \triangleq \operatorname{E}[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]:

:m_X(t) = m_X(t + \tau) \quad \text{for all } \tau \in \mathbb{R},
:K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R},
:\operatorname{E}[|X_t|^2] < \infty \quad \text{for all } t \in \mathbb{R}.

The first property implies that the mean function m_X(t) must be constant. The second property implies that the autocovariance function depends only on the ''difference'' between t_1 and t_2 and only needs to be indexed by one variable rather than two. Thus, instead of writing

:K_{XX}(t_1 - t_2, 0),

the notation is often abbreviated by the substitution \tau = t_1 - t_2:

:K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0).

This also implies that the autocorrelation depends only on \tau = t_1 - t_2, that is

:R_X(t_1, t_2) = R_X(t_1 - t_2, 0) \triangleq R_X(\tau).

The third property says that the second moments must be finite for any time t.
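The second WSS property can be illustrated by estimating ensemble autocovariances of a stationary AR(1) process: pairs of times with the same lag give the same covariance. This is a sketch with arbitrary choices (phi = 0.6, the lag, and the sample sizes), not from the original text; for a stationary Gaussian AR(1) with unit-variance shocks, K(tau) = phi^|tau| / (1 - phi^2).

```python
import numpy as np

rng = np.random.default_rng(4)
phi, n_paths, n_steps = 0.6, 20_000, 60

# Many independent realisations of a stationary AR(1):
#   X_t = phi * X_{t-1} + e_t,   e_t ~ N(0, 1),
# started from the stationary marginal N(0, 1 / (1 - phi^2)).
X = np.empty((n_paths, n_steps))
X[:, 0] = rng.normal(scale=np.sqrt(1.0 / (1.0 - phi**2)), size=n_paths)
for t in range(1, n_steps):
    X[:, t] = phi * X[:, t - 1] + rng.normal(size=n_paths)

# Ensemble autocovariances at two time pairs sharing the same lag tau = 5:
cov_a = np.mean(X[:, 10] * X[:, 15])   # estimates K_XX(10, 15)
cov_b = np.mean(X[:, 40] * X[:, 45])   # estimates K_XX(40, 45)
theory = phi**5 / (1.0 - phi**2)       # depends only on the lag, not on t
```

Both estimates agree with the lag-only theoretical value, as the second property requires.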


Motivation

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let ''H'' be the Hilbert space generated by \{X_t\} (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure \mu on the real line such that ''H'' is isomorphic to the Hilbert subspace of ''L''2(''μ'') generated by \{e^{-2\pi i\lambda t}\}. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process \omega_\lambda with orthogonal increments such that, for all t,

:X_t = \int e^{-2\pi i\lambda t} \, d\omega_\lambda,

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (it depends only on the difference between its two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable: all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal-processing algorithms.
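A small sketch of the frequency-domain viewpoint: filter white noise with a toy LTI filter (a 3-tap moving average, an arbitrary choice) and check that the ensemble-averaged output power spectral density is |H(f)|^2 times the flat input PSD. The filter, sample sizes, and circular-convolution shortcut are illustrative assumptions, not from the original text.

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths, n = 1000, 1024
x = rng.normal(size=(n_paths, n))   # realisations of unit-variance white noise

# A toy LTI filter: 3-tap moving average, h = [1/3, 1/3, 1/3].
h = np.ones(3) / 3.0
H = np.fft.rfft(h, n)               # frequency response on the FFT grid

# Apply the filter in the frequency domain (circular convolution, for
# simplicity -- multiplication by H is exactly the LTI operator acting on
# its complex-exponential eigenfunctions).
Y = np.fft.irfft(np.fft.rfft(x, axis=1) * H, n, axis=1)

# Ensemble-averaged output PSD: for white input with PSD = 1, the output
# PSD should be |H(f)|^2.
psd_out = np.mean(np.abs(np.fft.rfft(Y, axis=1)) ** 2, axis=0) / n
```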


Definition for complex stochastic process

In the case where \left\{X_t\right\} is a complex stochastic process, the autocovariance function is defined as K_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))\overline{(X_{t_2}-m_X(t_2))}] and, in addition to the requirements for wide-sense stationarity, it is required that the pseudo-autocovariance function J_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))] depends only on the time lag. In formulas, \left\{X_t\right\} is WSS if

:m_X(t) = m_X(t + \tau) \quad \text{for all } \tau \in \mathbb{R},
:K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R},
:J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R},
:\operatorname{E}[|X_t|^2] < \infty \quad \text{for all } t \in \mathbb{R}.


Joint stationarity

The concept of stationarity may be extended to two stochastic processes.


Joint strict-sense stationarity

Two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} are called jointly strict-sense stationary if their joint cumulative distribution F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t'_1}, \ldots, y_{t'_n}) remains unchanged under time shifts, i.e. if

:F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t'_1}, \ldots, y_{t'_n}) = F_{XY}(x_{t_1+\tau}, \ldots, x_{t_m+\tau}, y_{t'_1+\tau}, \ldots, y_{t'_n+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t'_1, \ldots, t'_n \in \mathbb{R} \text{ and for all } m, n \in \mathbb{N}.


Joint (''M'' + ''N'')th-order stationarity

Two random processes \left\{X_t\right\} and \left\{Y_t\right\} are said to be jointly (''M'' + ''N'')-th-order stationary if

:F_{XY}(x_{t_1}, \ldots, x_{t_m}, y_{t'_1}, \ldots, y_{t'_n}) = F_{XY}(x_{t_1+\tau}, \ldots, x_{t_m+\tau}, y_{t'_1+\tau}, \ldots, y_{t'_n+\tau}) \quad \text{for all } \tau, t_1, \ldots, t_m, t'_1, \ldots, t'_n \in \mathbb{R} \text{ and for all } m \in \{1, \ldots, M\}, n \in \{1, \ldots, N\}.


Joint weak or wide-sense stationarity

Two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} are called jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance function K_{XY}(t_1, t_2) = \operatorname{E}[(X_{t_1}-m_X(t_1))(Y_{t_2}-m_Y(t_2))] depends only on the time difference \tau = t_1 - t_2. This may be summarized as follows:

:m_X(t) = m_X(t + \tau) \quad \text{and} \quad m_Y(t) = m_Y(t + \tau) \quad \text{for all } \tau \in \mathbb{R},
:K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) \quad \text{and} \quad K_{YY}(t_1, t_2) = K_{YY}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R},
:K_{XY}(t_1, t_2) = K_{XY}(t_1 - t_2, 0) \quad \text{for all } t_1, t_2 \in \mathbb{R}.


Relation between types of stationarity

* If a stochastic process is ''N''-th-order stationary, then it is also ''M''-th-order stationary for all M \le N.
* If a stochastic process is second-order stationary (N = 2) and has finite second moments, then it is also wide-sense stationary.
* If a stochastic process is wide-sense stationary, it is not necessarily second-order stationary.
* If a stochastic process is strict-sense stationary and has finite second moments, it is wide-sense stationary.
* If two stochastic processes are jointly (''M'' + ''N'')-th-order stationary, this does not guarantee that the individual processes are ''M''-th- respectively ''N''-th-order stationary.


Other terminology

The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

* Priestley uses stationary up to order ''m'' if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order ''m''. Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
* Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher ''n''-point statistics are assumed to be stationary in the spatial domain.
* Tahmasebi and Sahimi have presented an adaptive Shannon-based methodology that can be used for modeling of any non-stationary systems.


Differencing

One way to make some time series stationary is to compute the differences between consecutive observations. This is known as differencing. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trends. It can also remove seasonality, if differences are taken appropriately (e.g. differencing observations one year apart to remove year-long seasonal cycles). Transformations such as logarithms can help to stabilize the variance of a time series.

One of the ways of identifying non-stationary time series is the ACF plot. Sometimes, seasonal patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case, and a non-stationary time series can nonetheless look stationary in the ACF plot. Another approach to identifying non-stationarity is to look at the Laplace transform of a series, which will identify both exponential trends and sinusoidal seasonality (complex exponential trends). Related techniques from signal analysis such as the wavelet transform and Fourier transform may also be helpful.
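The ACF diagnostic and the effect of differencing can be sketched in a few lines of NumPy. The `acf` helper below is a hand-rolled illustration (not a library function), and the random-walk example is an arbitrary choice: its sample ACF decays very slowly, while the ACF of its differences drops to near zero beyond lag 0.

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelation of a 1-D series at lags 0..nlags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

rng = np.random.default_rng(6)
walk = np.cumsum(rng.normal(size=5000))   # random walk: non-stationary

acf_walk = acf(walk, 20)           # decays very slowly: a non-stationarity flag
acf_diff = acf(np.diff(walk), 20)  # near zero beyond lag 0 after differencing
```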


See also

* Lévy process
* Stationary ergodic process
* Wiener–Khinchin theorem
* Ergodicity
* Statistical regularity
* Autocorrelation
* Whittle likelihood


References


Further reading

* Hyndman, Athanasopoulos (2013). ''Forecasting: Principles and Practice''. OTexts. https://www.otexts.org/fpp/8/1


External links


Spectral decomposition of a random function (Springer)