The cross-correlation matrix of two random vectors is a matrix containing as elements the cross-correlations of all pairs of elements of the random vectors. The cross-correlation matrix is used in various digital signal processing algorithms.


Definition

For two random vectors \mathbf{X} = (X_1,\ldots,X_m)^{\rm T} and \mathbf{Y} = (Y_1,\ldots,Y_n)^{\rm T}, each containing random elements whose expected value and variance exist, the cross-correlation matrix of \mathbf{X} and \mathbf{Y} is defined by

:\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}]

and has dimensions m \times n. Written component-wise:

:\operatorname{R}_{\mathbf{X}\mathbf{Y}} = \begin{bmatrix} \operatorname{E}[X_1 Y_1] & \operatorname{E}[X_1 Y_2] & \cdots & \operatorname{E}[X_1 Y_n] \\ \operatorname{E}[X_2 Y_1] & \operatorname{E}[X_2 Y_2] & \cdots & \operatorname{E}[X_2 Y_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_m Y_1] & \operatorname{E}[X_m Y_2] & \cdots & \operatorname{E}[X_m Y_n] \end{bmatrix}

The random vectors \mathbf{X} and \mathbf{Y} need not have the same dimension, and either might be a scalar value.
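As a numerical sketch (not part of the definition — NumPy, the variable names, and the distributions below are illustrative assumptions), the entries of \operatorname{R}_{\mathbf{X}\mathbf{Y}} can be estimated from joint samples by averaging the outer products \mathbf{X}\mathbf{Y}^{\rm T}:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, N = 3, 2, 100_000  # dimensions of X and Y, number of joint samples

# Illustrative joint samples: Y shares components with X, so they correlate.
X = rng.normal(size=(N, m))
Y = X[:, :n] + 0.5 * rng.normal(size=(N, n))

# Sample estimate of R_XY = E[X Y^T]: the average of the outer products.
R_XY = (X.T @ Y) / N
print(R_XY.shape)  # (3, 2), i.e. m x n
```

With this construction \operatorname{E}[X_1 Y_1] = 1, and the top-left entry of the estimate is close to that value for large N.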


Example

For example, if \mathbf{X} = \left( X_1,X_2,X_3 \right)^{\rm T} and \mathbf{Y} = \left( Y_1,Y_2 \right)^{\rm T} are random vectors, then \operatorname{R}_{\mathbf{X}\mathbf{Y}} is a 3 \times 2 matrix whose (i,j)-th entry is \operatorname{E}[X_i Y_j].


Complex random vectors

If \mathbf{Z} = (Z_1,\ldots,Z_m)^{\rm T} and \mathbf{W} = (W_1,\ldots,W_n)^{\rm T} are complex random vectors, each containing random variables whose expected value and variance exist, the cross-correlation matrix of \mathbf{Z} and \mathbf{W} is defined by

:\operatorname{R}_{\mathbf{Z}\mathbf{W}} \triangleq \operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}]

where {}^{\rm H} denotes Hermitian transposition.
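A minimal sketch of the complex case (the setup below is an assumption for illustration): the only change from the real case is the complex conjugate implied by the Hermitian transpose:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, N = 2, 2, 50_000

# Illustrative complex random vectors; W is a noisy copy of Z.
Z = rng.normal(size=(N, m)) + 1j * rng.normal(size=(N, m))
W = Z + 0.1 * (rng.normal(size=(N, n)) + 1j * rng.normal(size=(N, n)))

# Sample estimate of R_ZW = E[Z W^H]: note the conjugate applied to W.
R_ZW = (Z.T @ W.conj()) / N
```

Here \operatorname{E}[Z_1 \overline{W_1}] = \operatorname{E}[|Z_1|^2] = 2, so the top-left entry of the estimate is close to 2 and essentially real.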


Uncorrelatedness

Two random vectors \mathbf{X}=(X_1,\ldots,X_m)^{\rm T} and \mathbf{Y}=(Y_1,\ldots,Y_n)^{\rm T} are called uncorrelated if

:\operatorname{E}[\mathbf{X} \mathbf{Y}^{\rm T}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}.

They are uncorrelated if and only if their cross-covariance matrix \operatorname{K}_{\mathbf{X}\mathbf{Y}} is zero. In the case of two complex random vectors \mathbf{Z} and \mathbf{W}, they are called uncorrelated if

:\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm H}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}

and

:\operatorname{E}[\mathbf{Z} \mathbf{W}^{\rm T}] = \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm T}.
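To make the condition concrete, here is a numerical check under assumed illustrative distributions: for independent (hence uncorrelated) vectors, the sample versions of \operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}] and \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T} agree up to sampling error:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000

# Independent (hence uncorrelated) real random vectors with nonzero means.
X = rng.normal(loc=1.0, size=(N, 3))
Y = rng.normal(loc=-2.0, size=(N, 2))

# Uncorrelated means E[X Y^T] = E[X] E[Y]^T; compare sample versions.
lhs = (X.T @ Y) / N                             # estimate of E[X Y^T]
rhs = np.outer(X.mean(axis=0), Y.mean(axis=0))  # estimate of E[X] E[Y]^T
print(np.allclose(lhs, rhs, atol=0.05))  # True, up to sampling error
```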


Properties


Relation to the cross-covariance matrix

The cross-correlation is related to the ''cross-covariance matrix'' as follows:

:\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\rm T}] = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T}

Respectively for complex random vectors:

:\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm H}] = \operatorname{R}_{\mathbf{Z}\mathbf{W}} - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}]^{\rm H}
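The real-vector identity \operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{R}_{\mathbf{X}\mathbf{Y}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\rm T} also holds exactly between the corresponding sample quantities, by the same algebra; a sketch with assumed illustrative data:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000

# Illustrative correlated samples with nonzero means.
X = rng.normal(loc=2.0, size=(N, 3))
Y = X[:, :2] + rng.normal(loc=-1.0, size=(N, 2))

# Sample versions of both sides of K_XY = R_XY - E[X] E[Y]^T.
R_XY = (X.T @ Y) / N
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
K_XY = (Xc.T @ Yc) / N                            # sample cross-covariance
mean_term = np.outer(X.mean(axis=0), Y.mean(axis=0))

print(np.allclose(K_XY, R_XY - mean_term))  # True: identity holds exactly
```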


See also

* Autocorrelation
* Correlation does not imply causation
* Covariance function
* Pearson product-moment correlation coefficient
* Correlation function (astronomy)
* Correlation function (statistical mechanics)
* Correlation function (quantum field theory)
* Mutual information
* Rate distortion theory
* Radial distribution function



Further reading

* Hayes, Monson H., ''Statistical Digital Signal Processing and Modeling'', John Wiley & Sons, Inc., 1996.
* Solomon W. Golomb and Guang Gong, ''Signal Design for Good Correlation: For Wireless Communication, Cryptography, and Radar'', Cambridge University Press, 2005.
* M. Soltanalian, ''Signal Design for Active Sensing and Communications'', Uppsala Dissertations from the Faculty of Science and Technology (printed by Elanders Sverige AB), 2014.