Uncorrelated

In probability theory and statistics, two real-valued random variables, X, Y, are said to be uncorrelated if their covariance, \operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y], is zero. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient, when it exists, of zero, except in the trivial case when either variable has zero variance (is a constant). In this case the correlation is undefined.

In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if \operatorname{E}[XY] = 0.

If X and Y are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.


Definition


Definition for two real random variables

Two random variables X,Y are called uncorrelated if their covariance \operatorname{cov}[X,Y] = \operatorname{E}[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])] is zero.[Kun Il Park, ''Fundamentals of Probability and Stochastic Processes with Applications to Communications'', Springer, 2018, ISBN 978-3-319-68074-3] Formally:

:X,Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]
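As a quick numerical illustration of this equivalence, here is a minimal sketch using NumPy; the distributions, sample size, and seed are arbitrary choices for the example, not part of the definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent samples: independence implies uncorrelatedness.
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)

# Sample versions of the two sides of E[XY] = E[X]E[Y].
lhs = np.mean(x * y)
rhs = np.mean(x) * np.mean(y)

print(f"E[XY]     ~ {lhs:.4f}")
print(f"E[X]E[Y]  ~ {rhs:.4f}")
print(f"cov(X, Y) ~ {np.cov(x, y)[0, 1]:.4f}")  # ~0 up to sampling noise
```

Both estimates agree up to Monte Carlo error, and the sample covariance is correspondingly close to zero.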


Definition for two complex random variables

Two complex random variables Z,W are called uncorrelated if their covariance \operatorname{K}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}] and their pseudo-covariance \operatorname{J}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])] are both zero, i.e.

:Z,W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \text{ and } \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]
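The two conditions are genuinely distinct in the complex case: the covariance conjugates W, the pseudo-covariance does not. A hedged NumPy sketch (the circularly symmetric Gaussian inputs are an illustrative assumption) estimates both quantities:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Independent circularly symmetric complex Gaussians.
z = rng.normal(size=n) + 1j * rng.normal(size=n)
w = rng.normal(size=n) + 1j * rng.normal(size=n)

zc = z - z.mean()
wc = w - w.mean()

cov = np.mean(zc * np.conj(wc))   # K_ZW: covariance (conjugated)
pcov = np.mean(zc * wc)           # J_ZW: pseudo-covariance (not conjugated)

print(f"K_ZW ~ {cov:.4f}")   # both ~0: Z and W are uncorrelated
print(f"J_ZW ~ {pcov:.4f}")
```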


Definition for more than two random variables

A set of two or more random variables X_1,\ldots,X_n is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the non-diagonal elements of the autocovariance matrix \operatorname{K}_{\mathbf{X}\mathbf{X}} of the random vector \mathbf{X} = (X_1,\ldots,X_n)^{\mathrm{T}} are all zero. The autocovariance matrix is defined as:

:\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X},\mathbf{X}] = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}\mathbf{X}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm{T}}
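NumPy's np.cov estimates exactly this matrix from samples (one row per variable). A minimal sketch, with the three-variable setup chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Three mutually independent (hence pairwise uncorrelated) variables.
X = rng.normal(size=(3, n))

K = np.cov(X)  # 3x3 sample autocovariance matrix

# Off-diagonal entries are the pairwise covariances and should be ~0.
off_diag = K[~np.eye(3, dtype=bool)]
print(K.round(3))
print("max |off-diagonal| =", np.abs(off_diag).max())
```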


Examples of dependence without correlation


Example 1

* Let X be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
* Let Y be a random variable, ''independent'' of X, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
* Let U be a random variable constructed as U = XY.

The claim is that U and X have zero covariance (and thus are uncorrelated), but are not independent.

Proof: Taking into account that

:\operatorname{E}[U] = \operatorname{E}[XY] = \operatorname{E}[X]\operatorname{E}[Y] = \operatorname{E}[X] \cdot 0 = 0,

where the second equality holds because X and Y are independent, one gets

:\begin{align} \operatorname{cov}[U,X] &= \operatorname{E}[(U - \operatorname{E}[U])(X - \operatorname{E}[X])] = \operatorname{E}[U(X - \tfrac12)] \\ &= \operatorname{E}[X^2 Y - \tfrac12 XY] = \operatorname{E}[(X^2 - \tfrac12 X)Y] = \operatorname{E}[X^2 - \tfrac12 X] \operatorname{E}[Y] = 0 \end{align}

Therefore, U and X are uncorrelated.

Independence of U and X means that for all a and b, \Pr(U=a \mid X=b) = \Pr(U=a). This is not true, in particular, for a=1 and b=0:
* \Pr(U=1 \mid X=0) = \Pr(XY=1 \mid X=0) = 0
* \Pr(U=1) = \Pr(XY=1) = 1/4

Thus \Pr(U=1 \mid X=0) \ne \Pr(U=1), so U and X are not independent. Q.E.D.
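Because X and Y take only finitely many values, the claim can also be verified by exact enumeration rather than simulation. A minimal sketch (the helper E and the enumeration over the four equally likely (X, Y) outcomes are just one way to mirror the proof above):

```python
from itertools import product

# The four equally likely outcomes (X, Y), each with probability 1/4.
outcomes = list(product([0, 1], [-1, 1]))
p = 1 / len(outcomes)

def E(f):
    """Exact expectation of f(x, y) under the joint distribution."""
    return sum(p * f(x, y) for x, y in outcomes)

# U = XY.  Covariance of U and X, via cov[U,X] = E[UX] - E[U]E[X]:
cov_UX = E(lambda x, y: x * y * x) - E(lambda x, y: x * y) * E(lambda x, y: x)
print("cov(U, X) =", cov_UX)  # exactly 0: uncorrelated

# But U and X are not independent:
p_U1_given_X0 = 0  # XY = 1 is impossible when X = 0
p_U1 = E(lambda x, y: 1 if x * y == 1 else 0)
print("P(U=1 | X=0) =", p_U1_given_X0, " P(U=1) =", p_U1)  # 0 vs 0.25
```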


Example 2

If X is a continuous random variable uniformly distributed on [-1,1] and Y = X^2, then X and Y are uncorrelated even though X determines Y and a particular value of Y can be produced by only one or two values of X:

:f_X(t) = \tfrac{1}{2} I_{[-1,1]}(t); \qquad f_Y(t) = \tfrac{1}{2\sqrt{t}} I_{(0,1]}(t)

On the other hand, f_{X,Y} is 0 on the triangle defined by 0 < X < Y < 1, although f_X \times f_Y is not null on this domain. Therefore f_{X,Y}(X,Y) \neq f_X(X) \times f_Y(Y) and the variables are not independent.

:\operatorname{E}[X] = 0; \qquad \operatorname{E}[Y] = \operatorname{E}[X^2] = \tfrac{1}{3}

:\operatorname{cov}[X,Y] = \operatorname{E}\left[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])\right] = \operatorname{E}\left[X^3 - \tfrac{X}{3}\right] = 0

Therefore the variables are uncorrelated.
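A quick Monte Carlo check of this example (a sketch; the sample size, seed, and the 0.9 conditioning threshold are arbitrary): the sample covariance of X and Y = X^2 vanishes, while conditioning on X clearly changes the distribution of Y:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1.0, 1.0, size=100_000)
y = x ** 2

print(f"cov(X, Y) ~ {np.cov(x, y)[0, 1]:.4f}")  # ~0: uncorrelated

# Dependence: E[Y] = 1/3 overall, but conditioning on |X| > 0.9
# forces Y > 0.81, so the conditional mean is very different.
print(f"E[Y]           ~ {y.mean():.3f}")
print(f"E[Y | |X|>0.9] ~ {y[np.abs(x) > 0.9].mean():.3f}")
```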


When uncorrelatedness implies independence

There are cases in which uncorrelatedness does imply independence. One of these cases is the one in which both random variables are two-valued (so each can be linearly transformed to have a Bernoulli distribution). Further, two jointly normally distributed random variables are independent if they are uncorrelated, although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
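The caveat can be made concrete with the classic counterexample: take X standard normal and Y = SX, where S = ±1 is a fair coin flip independent of X. Then Y is also standard normal and cov(X, Y) = 0, yet Y is determined by X up to sign, so (X, Y) is not jointly normal. A hedged NumPy sketch of this construction:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

x = rng.normal(size=n)
s = rng.choice([-1.0, 1.0], size=n)  # independent random sign
y = s * x                            # marginally standard normal

print(f"cov(X, Y) ~ {np.cov(x, y)[0, 1]:.4f}")  # ~0: uncorrelated
# |Y| = |X| exactly, so this correlation is 1: X and Y are dependent,
# even though both marginals are normal and the pair is uncorrelated.
print(f"corr(|X|, |Y|) = {np.corrcoef(np.abs(x), np.abs(y))[0, 1]:.4f}")
```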


Generalizations


Uncorrelated random vectors

Two random vectors \mathbf{X} = (X_1,\ldots,X_m)^{\mathrm{T}} and \mathbf{Y} = (Y_1,\ldots,Y_n)^{\mathrm{T}} are called uncorrelated if

:\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.

They are uncorrelated if and only if their cross-covariance matrix \operatorname{K}_{\mathbf{X}\mathbf{Y}} is zero.

Two complex random vectors \mathbf{Z} and \mathbf{W} are called uncorrelated if their cross-covariance matrix and their pseudo-cross-covariance matrix are zero, i.e. if

:\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0

where

:\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{H}}]

and

:\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{T}}].
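For the real case, a minimal sketch estimating the cross-covariance matrix of two independent random vectors (the dimensions 2 and 3 are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

X = rng.normal(size=(2, n))  # random vector of dimension m = 2
Y = rng.normal(size=(3, n))  # random vector of dimension n = 3

Xc = X - X.mean(axis=1, keepdims=True)
Yc = Y - Y.mean(axis=1, keepdims=True)

K_XY = Xc @ Yc.T / n  # 2x3 sample cross-covariance matrix
print(K_XY.round(3))  # all entries ~0: the vectors are uncorrelated
```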


Uncorrelated stochastic processes

Two stochastic processes \left\{X_t\right\} and \left\{Y_t\right\} are called uncorrelated if their cross-covariance \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = \operatorname{E}\left[ \left( X(t_1) - \mu_X(t_1) \right) \left( Y(t_2) - \mu_Y(t_2) \right) \right] is zero for all times. Formally:

:\left\{X_t\right\}, \left\{Y_t\right\} \text{ uncorrelated} \quad :\iff \quad \forall t_1,t_2 \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = 0.
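A sketch estimating \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) over a small grid of time pairs by averaging across independent realizations (the white-noise processes, grid size, and trial count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
trials, T = 20_000, 5  # many independent realizations, 5 time points each

# Independent white-noise processes X(t), Y(t); rows are realizations.
X = rng.normal(size=(trials, T))
Y = rng.normal(size=(trials, T))

Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# K_XY(t1, t2) estimated by averaging over realizations.
K = Xc.T @ Yc / trials  # T x T matrix of cross-covariances
print(np.abs(K).max())  # ~0 for every pair (t1, t2): uncorrelated processes
```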


See also

* Correlation and dependence
* Binomial distribution: Covariance between two binomials
* Uncorrelated Volume Element


References


Further reading

* ''Probability for Statisticians'', Galen R. Shorack, Springer (c. 2000), ISBN 0-387-98953-6