In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the ''i'', ''j'' position is the covariance between the ''i''-th element of a random vector and the ''j''-th element of another random vector. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of ''observed'' empirical values or a finite or infinite number of ''potential'' values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.

The cross-covariance matrix of two random vectors \mathbf{X} and \mathbf{Y} is typically denoted by \operatorname{K}_{\mathbf{X}\mathbf{Y}} or \Sigma_{\mathbf{X}\mathbf{Y}}.


Definition

For random vectors \mathbf{X} and \mathbf{Y}, each containing random elements whose expected value and variance exist, the cross-covariance matrix of \mathbf{X} and \mathbf{Y} is defined by

\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X},\mathbf{Y}) \stackrel{\mathrm{def}}{=}\ \operatorname{E}[(\mathbf{X} - \mathbf{\mu_X})(\mathbf{Y} - \mathbf{\mu_Y})^{\mathrm{T}}]

where \mathbf{\mu_X} = \operatorname{E}[\mathbf{X}] and \mathbf{\mu_Y} = \operatorname{E}[\mathbf{Y}] are vectors containing the expected values of \mathbf{X} and \mathbf{Y}. The vectors \mathbf{X} and \mathbf{Y} need not have the same dimension, and either might be a scalar value.

The cross-covariance matrix is the matrix whose (i,j) entry is the covariance

\operatorname{K}_{X_i Y_j} = \operatorname{cov}[X_i, Y_j] = \operatorname{E}[(X_i - \operatorname{E}[X_i])(Y_j - \operatorname{E}[Y_j])]

between the ''i''-th element of \mathbf{X} and the ''j''-th element of \mathbf{Y}. This gives the following component-wise definition of the cross-covariance matrix:

\operatorname{K}_{\mathbf{X}\mathbf{Y}} =
\begin{bmatrix}
\operatorname{E}[(X_1 - \operatorname{E}[X_1])(Y_1 - \operatorname{E}[Y_1])] & \operatorname{E}[(X_1 - \operatorname{E}[X_1])(Y_2 - \operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_1 - \operatorname{E}[X_1])(Y_n - \operatorname{E}[Y_n])] \\
\operatorname{E}[(X_2 - \operatorname{E}[X_2])(Y_1 - \operatorname{E}[Y_1])] & \operatorname{E}[(X_2 - \operatorname{E}[X_2])(Y_2 - \operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_2 - \operatorname{E}[X_2])(Y_n - \operatorname{E}[Y_n])] \\
\vdots & \vdots & \ddots & \vdots \\
\operatorname{E}[(X_m - \operatorname{E}[X_m])(Y_1 - \operatorname{E}[Y_1])] & \operatorname{E}[(X_m - \operatorname{E}[X_m])(Y_2 - \operatorname{E}[Y_2])] & \cdots & \operatorname{E}[(X_m - \operatorname{E}[X_m])(Y_n - \operatorname{E}[Y_n])]
\end{bmatrix}
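As an illustration (not part of the original article), the following NumPy sketch estimates the cross-covariance matrix from made-up paired samples and checks that the component-wise definition agrees with the matrix form \operatorname{E}[(\mathbf{X} - \mathbf{\mu_X})(\mathbf{Y} - \mathbf{\mu_Y})^{\mathrm{T}}]; all data and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy paired samples of a 3-dimensional X and a 2-dimensional Y (made-up data).
n = 10_000
X = rng.normal(size=(n, 3))
Y = X[:, :2] + rng.normal(size=(n, 2))   # make Y correlated with part of X

mu_X = X.mean(axis=0)
mu_Y = Y.mean(axis=0)

# Component-wise definition: K_XY[i, j] = E[(X_i - E[X_i]) (Y_j - E[Y_j])]
K_entrywise = np.empty((3, 2))
for i in range(3):
    for j in range(2):
        K_entrywise[i, j] = np.mean((X[:, i] - mu_X[i]) * (Y[:, j] - mu_Y[j]))

# Matrix form of the same definition: K_XY = E[(X - mu_X)(Y - mu_Y)^T]
K_matrix = (X - mu_X).T @ (Y - mu_Y) / n

assert np.allclose(K_entrywise, K_matrix)
print(K_matrix)   # 3 x 2, roughly [[1, 0], [0, 1], [0, 0]] for this construction
```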


Example

For example, if \mathbf{X} = \left( X_1, X_2, X_3 \right)^{\mathrm{T}} and \mathbf{Y} = \left( Y_1, Y_2 \right)^{\mathrm{T}} are random vectors, then \operatorname{cov}(\mathbf{X},\mathbf{Y}) is a 3 \times 2 matrix whose (i,j)-th entry is \operatorname{cov}(X_i, Y_j).
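A minimal numerical sketch of this example, assuming NumPy and synthetic data: the cross-covariance of a 3-dimensional \mathbf{X} and a 2-dimensional \mathbf{Y} appears as the 3 \times 2 upper-right block of the joint covariance matrix of the stacked vector.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative samples of X = (X1, X2, X3)^T and Y = (Y1, Y2)^T
n = 5_000
X = rng.normal(size=(n, 3))
Y = 0.5 * X[:, :2] + rng.normal(size=(n, 2))

# np.cov expects variables in rows; stack X and Y into one (3 + 2)-variable matrix
C = np.cov(np.vstack([X.T, Y.T]))   # 5 x 5 joint covariance matrix

# cov(X, Y) is the upper-right block of the joint covariance matrix
K_XY = C[:3, 3:]
print(K_XY.shape)                   # (3, 2); entry (i, j) estimates cov(X_i, Y_j)
```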


Properties

For the cross-covariance matrix, the following basic properties apply (a numerical spot-check follows the list):

# \operatorname{cov}(\mathbf{X},\mathbf{Y}) = \operatorname{E}[\mathbf{X} \mathbf{Y}^{\mathrm{T}}] - \mathbf{\mu_X} \mathbf{\mu_Y}^{\mathrm{T}}
# \operatorname{cov}(\mathbf{X},\mathbf{Y}) = \operatorname{cov}(\mathbf{Y},\mathbf{X})^{\mathrm{T}}
# \operatorname{cov}(\mathbf{X_1} + \mathbf{X_2},\mathbf{Y}) = \operatorname{cov}(\mathbf{X_1},\mathbf{Y}) + \operatorname{cov}(\mathbf{X_2},\mathbf{Y})
# \operatorname{cov}(A\mathbf{X} + \mathbf{a}, B^{\mathrm{T}}\mathbf{Y} + \mathbf{b}) = A\, \operatorname{cov}(\mathbf{X},\mathbf{Y}) \,B
# If \mathbf{X} and \mathbf{Y} are independent (or somewhat less restrictedly, if every random variable in \mathbf{X} is uncorrelated with every random variable in \mathbf{Y}), then \operatorname{cov}(\mathbf{X},\mathbf{Y}) = 0_{p \times q}

where \mathbf{X}, \mathbf{X_1} and \mathbf{X_2} are random p \times 1 vectors, \mathbf{Y} is a random q \times 1 vector, \mathbf{a} is a q \times 1 vector, \mathbf{b} is a p \times 1 vector, A and B are q \times p matrices of constants, and 0_{p \times q} is a p \times q matrix of zeroes.
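These identities can be spot-checked numerically. The sketch below is illustrative only: it uses synthetic data and a hypothetical cross_cov helper with the 1/n normalization, under which properties 1, 2 and 4 hold exactly up to floating-point rounding.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic paired samples: X is p = 3 dimensional, Y is q = 2 dimensional.
n = 10_000
X = rng.normal(size=(n, 3))
Y = rng.normal(size=(n, 2)) + X[:, :2]

def cross_cov(U, V):
    """Sample cross-covariance E[(U - mu_U)(V - mu_V)^T], 1/n normalization."""
    return (U - U.mean(axis=0)).T @ (V - V.mean(axis=0)) / len(U)

K_XY = cross_cov(X, Y)

# Property 1: cov(X, Y) = E[X Y^T] - mu_X mu_Y^T
assert np.allclose(K_XY, X.T @ Y / n - np.outer(X.mean(axis=0), Y.mean(axis=0)))

# Property 2: cov(X, Y) = cov(Y, X)^T
assert np.allclose(K_XY, cross_cov(Y, X).T)

# Property 4: cov(A X + a, B^T Y + b) = A cov(X, Y) B, with A and B of shape q x p
A = rng.normal(size=(2, 3))
B = rng.normal(size=(2, 3))
a = rng.normal(size=2)
b = rng.normal(size=3)
assert np.allclose(cross_cov(X @ A.T + a, Y @ B + b), A @ K_XY @ B)
```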


Definition for complex random vectors

If \mathbf{X} and \mathbf{Y} are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X},\mathbf{Y}) \stackrel{\mathrm{def}}{=}\ \operatorname{E}[(\mathbf{X} - \mathbf{\mu_X})(\mathbf{Y} - \mathbf{\mu_Y})^{\mathrm{H}}]

For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

\operatorname{J}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X},\overline{\mathbf{Y}}) \stackrel{\mathrm{def}}{=}\ \operatorname{E}[(\mathbf{X} - \mathbf{\mu_X})(\mathbf{Y} - \mathbf{\mu_Y})^{\mathrm{T}}]
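A short illustrative sketch (NumPy, made-up circularly symmetric data) computes both matrices; the only difference is whether the second factor is conjugated before transposing.

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up complex random vectors: X and Y are both 2-dimensional here.
n = 20_000
X = rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))
Y = X + 0.5 * (rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2)))

Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# Cross-covariance matrix: E[(X - mu_X)(Y - mu_Y)^H]  -- conjugate (Hermitian) transpose
K_XY = Xc.T @ Yc.conj() / n

# Pseudo-cross-covariance matrix: E[(X - mu_X)(Y - mu_Y)^T]  -- plain transpose
J_XY = Xc.T @ Yc / n

print(np.round(K_XY, 2))   # roughly 2 * I here, since diagonal entries estimate E[|X_i|^2] = 2
print(np.round(J_XY, 2))   # near the zero matrix, since the data are circularly symmetric
```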


Uncorrelatedness

Two random vectors \mathbf{X} and \mathbf{Y} are called uncorrelated if their cross-covariance matrix \operatorname{K}_{\mathbf{X}\mathbf{Y}} is a zero matrix. Complex random vectors \mathbf{X} and \mathbf{Y} are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if \operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{J}_{\mathbf{X}\mathbf{Y}} = 0.

