Partial Correlation

In probability theory and statistics, partial correlation measures the degree of association between two random variables, with the effect of a set of controlling random variables removed. When determining the numerical relationship between two variables of interest, using their correlation coefficient will give misleading results if there is another confounding variable that is numerically related to both variables of interest. This misleading information can be avoided by controlling for the confounding variable, which is done by computing the partial correlation coefficient. This is precisely the motivation for including other right-side variables in a multiple regression; but while multiple regression gives unbiased results for the effect size, it does not give a numerical value of a measure of the strength of the relationship between the two variables of interest.

For example, given economic data on the consumption, income, and wealth of various individuals, consider the relationship between consumption and income. Failing to control for wealth when computing a correlation coefficient between consumption and income would give a misleading result, since income might be numerically related to wealth which in turn might be numerically related to consumption; a measured correlation between consumption and income might actually be contaminated by these other correlations. The use of a partial correlation avoids this problem.

Like the correlation coefficient, the partial correlation coefficient takes on a value in the range from −1 to 1. The value −1 conveys a perfect negative correlation controlling for some variables (that is, an exact linear relationship in which higher values of one variable are associated with lower values of the other); the value 1 conveys a perfect positive linear relationship, and the value 0 conveys that there is no linear relationship. The partial correlation coincides with the conditional correlation if the random variables are jointly distributed as the multivariate normal, other elliptical, multivariate hypergeometric, multivariate negative hypergeometric, multinomial, or Dirichlet distribution, but not in general otherwise.


Formal definition

Formally, the partial correlation between ''X'' and ''Y'' given a set of ''n'' controlling variables Z = {''Z''<sub>1</sub>, ''Z''<sub>2</sub>, ..., ''Z''<sub>''n''</sub>}, written ''ρ''''XY''·Z, is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively. The first-order partial correlation (i.e., when ''n'' = 1) is the difference between a correlation and the product of the removable correlations, divided by the product of the coefficients of alienation of the removable correlations. The coefficient of alienation, and its relation with joint variance through correlation, are discussed in Guilford (1973, pp. 344–345).
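
Written out for a single controlling variable ''Z'' (this is the same expression that appears below as the single-variable case of the recursive formula, with \sqrt{1-\rho^2} being the coefficient of alienation of a correlation \rho):

:\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\,\rho_{ZY}}{\sqrt{1-\rho_{XZ}^2}\,\sqrt{1-\rho_{ZY}^2}}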


Computation


Using linear regression

A simple way to compute the sample partial correlation for some data is to solve the two associated linear regression problems and calculate the correlation between the residuals. Let ''X'' and ''Y'' be random variables taking real values, and let Z be the ''n''-dimensional vector-valued random variable. Let ''x''''i'', ''y''''i'' and z''i'' denote the ''i''th of ''N'' i.i.d. observations from some joint probability distribution over real random variables ''X'', ''Y'', and Z, with z''i'' having been augmented with a 1 to allow for a constant term in the regression. Solving the linear regression problems amounts to finding (''n''+1)-dimensional regression coefficient vectors \mathbf{w}_X^* and \mathbf{w}_Y^* such that

:\mathbf{w}_X^* = \arg\min_{\mathbf{w}} \left\{ \sum_{i=1}^N \left(x_i - \langle\mathbf{w}, \mathbf{z}_i \rangle\right)^2 \right\}
:\mathbf{w}_Y^* = \arg\min_{\mathbf{w}} \left\{ \sum_{i=1}^N \left(y_i - \langle\mathbf{w}, \mathbf{z}_i \rangle\right)^2 \right\}

where ''N'' is the number of observations, and \langle\mathbf{w}, \mathbf{z}_i \rangle is the scalar product between the vectors \mathbf{w} and \mathbf{z}_i. The residuals are then

:e_{X,i} = x_i - \langle\mathbf{w}_X^*, \mathbf{z}_i \rangle
:e_{Y,i} = y_i - \langle\mathbf{w}_Y^*, \mathbf{z}_i \rangle

and the sample partial correlation is then given by the usual formula for sample correlation, but between these new ''derived'' values:

:\begin{align}
\hat{\rho}_{XY\cdot\mathbf{Z}} &= \frac{N\sum_{i=1}^N e_{X,i}e_{Y,i} - \sum_{i=1}^N e_{X,i}\sum_{i=1}^N e_{Y,i}}{\sqrt{N\sum_{i=1}^N e_{X,i}^2 - \left(\sum_{i=1}^N e_{X,i}\right)^2}\,\sqrt{N\sum_{i=1}^N e_{Y,i}^2 - \left(\sum_{i=1}^N e_{Y,i}\right)^2}} \\
&= \frac{N\sum_{i=1}^N e_{X,i}e_{Y,i}}{\sqrt{N\sum_{i=1}^N e_{X,i}^2}\,\sqrt{N\sum_{i=1}^N e_{Y,i}^2}}.
\end{align}

In the first expression the three terms after minus signs all equal 0 since each contains the sum of residuals from an ordinary least squares regression.


Example

Consider the following data on three variables, ''X'', ''Y'', and ''Z'':

 ''X''   ''Y''   ''Z''
   2      1      0
   4      2      0
  15      3      1
  20      4      1

Computing the Pearson correlation coefficient between variables ''X'' and ''Y'' results in approximately 0.970, while computing the partial correlation between ''X'' and ''Y'', using the formula given above, gives a partial correlation of 0.919. The computations were done using R with the following code.

 > X <- c(2,4,15,20)
 > Y <- c(1,2,3,4)
 > Z <- c(0,0,1,1)
 > mm1 <- lm(X~Z)
 > res1 <- mm1$residuals
 > mm2 <- lm(Y~Z)
 > res2 <- mm2$residuals
 > cor(res1, res2)
 [1] 0.919145
 > cor(X, Y)
 [1] 0.9695016
 > generalCorr::parcorMany(cbind(X, Y, Z))
   nami namj partij   partji rijMrji
   "X"  "Y"  "0.8844" "1"    "-0.1156"
   "X"  "Z"  "0.1581" "1"    "-0.8419"

The lower part of the above code reports the generalized nonlinear partial correlation coefficient between ''X'' and ''Y'' after removing the nonlinear effect of ''Z'' to be 0.8844, and the generalized nonlinear partial correlation coefficient between ''X'' and ''Z'' after removing the nonlinear effect of ''Y'' to be 0.1581. See the R package `generalCorr` and its vignettes for details. Simulation and other details are in Vinod (2017), "Generalized correlation and kernel causality with applications in development economics", ''Communications in Statistics - Simulation and Computation'', vol. 46, pp. 4513–4534, available online 29 Dec 2015, https://doi.org/10.1080/03610918.2015.1122048.


Using recursive formula

It can be computationally expensive to solve the linear regression problems. Actually, the ''n''th-order partial correlation (i.e., with |Z| = ''n'') can be easily computed from three (''n''−1)th-order partial correlations. The zeroth-order partial correlation ''ρ''''XY''·Ø is defined to be the regular correlation coefficient ''ρ''''XY''. It holds, for any Z_0 \in \mathbf{Z}, that

:\rho_{XY\cdot\mathbf{Z}} = \frac{\rho_{XY\cdot\mathbf{Z}\setminus\{Z_0\}} - \rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}\,\rho_{Z_0Y\cdot\mathbf{Z}\setminus\{Z_0\}}}{\sqrt{1-\rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}^2}\,\sqrt{1-\rho_{Z_0Y\cdot\mathbf{Z}\setminus\{Z_0\}}^2}}

Naïvely implementing this computation as a recursive algorithm yields an exponential time complexity. However, this computation has the overlapping subproblems property, such that using dynamic programming or simply caching the results of the recursive calls yields a complexity of \mathcal{O}(n^3). Note in the case where ''Z'' is a single variable, this reduces to

:\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\,\rho_{ZY}}{\sqrt{1-\rho_{XZ}^2}\,\sqrt{1-\rho_{ZY}^2}}
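
As a quick check on this single-variable reduction, the R sketch below (reusing the toy data from the example above) computes the first-order partial correlation directly from the three pairwise correlations; it reproduces the residual-based value of roughly 0.919.

 # First-order partial correlation of X and Y given a single variable Z,
 # computed from the three pairwise Pearson correlations.
 pcor1 <- function(x, y, z) {
   r_xy <- cor(x, y)
   r_xz <- cor(x, z)
   r_zy <- cor(z, y)
   (r_xy - r_xz * r_zy) / (sqrt(1 - r_xz^2) * sqrt(1 - r_zy^2))
 }

 X <- c(2, 4, 15, 20)
 Y <- c(1, 2, 3, 4)
 Z <- c(0, 0, 1, 1)
 pcor1(X, Y, Z)   # approximately 0.919, matching cor(res1, res2) above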


Using matrix inversion

The partial correlation can also be written in terms of the joint precision matrix. Consider a set of random variables \mathbf{V} = \{X_1, \ldots, X_n\} of cardinality ''n''. We want the partial correlation between two variables X_i and X_j given all others, i.e., \mathbf{V} \setminus \{X_i, X_j\}. Suppose the (joint/full) covariance matrix \Sigma = (\sigma_{ij}) is positive definite and therefore invertible. If the precision matrix is defined as \Omega = (p_{ij}) = \Sigma^{-1}, then

:\rho_{X_iX_j\cdot\mathbf{V}\setminus\{X_i,X_j\}} = -\frac{p_{ij}}{\sqrt{p_{ii}\,p_{jj}}} \qquad (1)

Computing this requires \Sigma^{-1}, the inverse of the covariance matrix \Sigma, which runs in \mathcal{O}(n^3) time (using the sample covariance matrix to obtain a sample partial correlation). Note that only a single matrix inversion is required to give ''all'' the partial correlations between pairs of variables in \mathbf{V}.

To prove Equation (1), return to the previous notation (i.e. X,Y,\mathbf{Z} \leftrightarrow X_i,X_j,\mathbf{V}\setminus\{X_i,X_j\}) and start with the definition of partial correlation: ''ρ''''XY''·Z is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively. First, suppose \beta,\gamma are the coefficients of the linear regression fits; that is,

:\beta = \operatorname{argmin}_\beta \mathbb{E}\,\| X - \beta^T Z\|^2
:\gamma = \operatorname{argmin}_\gamma \mathbb{E}\,\| Y - \gamma^T Z\|^2

Write the joint covariance matrix for the vector (X,Y,Z^T)^T as

:\Sigma = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} & \Sigma_{XZ} \\ \Sigma_{YX} & \Sigma_{YY} & \Sigma_{YZ} \\ \Sigma_{ZX} & \Sigma_{ZY} & \Sigma_{ZZ} \end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix}

where

:C_{11} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix}, \qquad C_{12} = \begin{bmatrix} \Sigma_{XZ} \\ \Sigma_{YZ} \end{bmatrix}, \qquad C_{21} = \begin{bmatrix} \Sigma_{ZX} & \Sigma_{ZY} \end{bmatrix}, \qquad C_{22} = \Sigma_{ZZ}

Then the standard formula for linear regression gives

:\beta = \left(\Sigma_{ZZ}\right)^{-1} \Sigma_{ZX}

Hence, the residuals can be written as

:R_X = X - \beta^T Z = X - \Sigma_{XZ} \left(\Sigma_{ZZ}\right)^{-1} Z
:R_Y = Y - \gamma^T Z = Y - \Sigma_{YZ} \left(\Sigma_{ZZ}\right)^{-1} Z

Note that R_X and R_Y have expectation zero because of the inclusion of an intercept term in Z. Computing the covariance now gives

:\operatorname{cov}(R_X, R_Y) = \Sigma_{XY} - \Sigma_{XZ}\left(\Sigma_{ZZ}\right)^{-1}\Sigma_{ZY} \qquad (2)

and similarly for \operatorname{cov}(R_X, R_X) and \operatorname{cov}(R_Y, R_Y). Next, write the precision matrix \Omega = \Sigma^{-1} in a similar block form:

:\Omega = \begin{bmatrix} \Omega_{XX} & \Omega_{XY} & \Omega_{XZ} \\ \Omega_{YX} & \Omega_{YY} & \Omega_{YZ} \\ \Omega_{ZX} & \Omega_{ZY} & \Omega_{ZZ} \end{bmatrix} = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}

Then, by Schur's formula for block-matrix inversion,

:P_{11}^{-1} = C_{11} - C_{12} C_{22}^{-1} C_{21}

The entries of the right-hand-side matrix are precisely the covariances previously computed in (2), giving

:P_{11}^{-1} = \begin{bmatrix} \operatorname{cov}(R_X,R_X) & \operatorname{cov}(R_X,R_Y) \\ \operatorname{cov}(R_Y,R_X) & \operatorname{cov}(R_Y,R_Y) \end{bmatrix}

Using the formula for the inverse of a 2×2 matrix gives

:\begin{align} P_{11}^{-1} & = \frac{1}{\det P_{11}} \begin{bmatrix} \left[P_{11}\right]_{22} & -\left[P_{11}\right]_{12} \\ -\left[P_{11}\right]_{21} & \left[P_{11}\right]_{11} \end{bmatrix} \\ & = \frac{1}{\det P_{11}} \begin{bmatrix} p_{YY} & -p_{XY} \\ -p_{YX} & p_{XX} \end{bmatrix} \end{align}

So indeed, the partial correlation is

:\rho_{XY\cdot\mathbf{Z}} = \frac{\operatorname{cov}(R_X,R_Y)}{\sqrt{\operatorname{cov}(R_X,R_X)\,\operatorname{cov}(R_Y,R_Y)}} = \frac{-\tfrac{1}{\det P_{11}}\, p_{XY}}{\sqrt{\tfrac{1}{\det P_{11}}\, p_{YY}\;\tfrac{1}{\det P_{11}}\, p_{XX}}} = -\frac{p_{XY}}{\sqrt{p_{XX}\,p_{YY}}}

as claimed in (1).
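
A minimal R sketch of this approach, reusing the toy data from the example above: the sample covariance matrix is inverted once, and every pairwise partial correlation is read off via equation (1).

 # Partial correlations of every pair of variables given all the others,
 # obtained from a single inversion of the sample covariance matrix.
 partial_cor_matrix <- function(data) {
   omega <- solve(cov(data))      # precision matrix
   d <- 1 / sqrt(diag(omega))
   pc <- -omega * outer(d, d)     # -p_ij / sqrt(p_ii * p_jj)
   diag(pc) <- 1
   pc
 }

 X <- c(2, 4, 15, 20)
 Y <- c(1, 2, 3, 4)
 Z <- c(0, 0, 1, 1)
 partial_cor_matrix(cbind(X, Y, Z))["X", "Y"]   # approximately 0.919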


Interpretation


Geometrical

Let three variables ''X'', ''Y'', ''Z'' (where ''Z'' is the "control" or "extra variable") be chosen from a joint probability distribution over ''n'' variables V. Further, let v''i'', 1 ≤ ''i'' ≤ ''N'', be ''N'' ''n''-dimensional i.i.d. observations taken from the joint probability distribution over V. The geometrical interpretation comes from considering the ''N''-dimensional vectors x (formed by the successive values of ''X'' over the observations), y (formed by the values of ''Y''), and z (formed by the values of ''Z''). It can be shown that the residuals ''e''''X'',''i'' coming from the linear regression of ''X'' on Z, if also considered as an ''N''-dimensional vector e''X'', have a zero scalar product with the vector z generated by Z. This means that the residuals vector lies on an (''N''−1)-dimensional hyperplane ''S''z that is perpendicular to z. The same also applies to the residuals ''e''''Y'',''i'' generating a vector e''Y''. The desired partial correlation is then the cosine of the angle ''φ'' between the projections e''X'' and e''Y'' of x and y, respectively, onto the hyperplane perpendicular to z.
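
A small check of this interpretation in R, again assuming the toy data from the example above: because the residual vectors are centered, the cosine of the angle between them is exactly their sample correlation.

 # Cosine of the angle between the two residual vectors equals the sample
 # partial correlation, since both residual vectors have mean zero.
 X <- c(2, 4, 15, 20)
 Y <- c(1, 2, 3, 4)
 Z <- c(0, 0, 1, 1)
 eX <- residuals(lm(X ~ Z))
 eY <- residuals(lm(Y ~ Z))
 cos_phi <- sum(eX * eY) / (sqrt(sum(eX^2)) * sqrt(sum(eY^2)))
 cos_phi                            # approximately 0.919
 all.equal(cos_phi, cor(eX, eY))    # TRUE (up to numerical precision)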


As conditional independence test

With the assumption that all involved variables are multivariate Gaussian, the partial correlation ''ρ''''XY''·Z is zero if and only if ''X'' is conditionally independent from ''Y'' given Z. This property does not hold in the general case.

To test if a sample partial correlation \hat{\rho}_{XY\cdot\mathbf{Z}} implies that the true population partial correlation differs from 0, Fisher's ''z''-transform of the partial correlation can be used:

:z(\hat{\rho}_{XY\cdot\mathbf{Z}}) = \frac{1}{2} \ln\left(\frac{1+\hat{\rho}_{XY\cdot\mathbf{Z}}}{1-\hat{\rho}_{XY\cdot\mathbf{Z}}}\right)

The null hypothesis is H_0: \rho_{XY\cdot\mathbf{Z}} = 0, to be tested against the two-tail alternative H_A: \rho_{XY\cdot\mathbf{Z}} \neq 0. H_0 can be rejected if

:\sqrt{N - |\mathbf{Z}| - 3}\cdot |z(\hat{\rho}_{XY\cdot\mathbf{Z}})| > \Phi^{-1}(1-\alpha/2)

where \Phi is the cumulative distribution function of a Gaussian distribution with zero mean and unit standard deviation, \alpha is the significance level of H_0, and ''N'' is the sample size. This ''z''-transform is approximate, and the actual distribution of the sample (partial) correlation coefficient is not straightforward. However, an exact t-test based on a combination of the partial regression coefficient, the partial correlation coefficient, and the partial variances is available. The distribution of the sample partial correlation was described by Fisher.
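
The approximate test is easy to sketch in R; the helper below is hypothetical (it is not taken from any package) and assumes the partial correlation has already been estimated using ''k'' = |Z| controlling variables.

 # Approximate two-sided test of H0: rho_XY.Z = 0 using Fisher's z-transform.
 # r_partial : estimated partial correlation
 # N         : sample size
 # k         : number of controlling variables |Z|
 pcor_test <- function(r_partial, N, k, alpha = 0.05) {
   z <- 0.5 * log((1 + r_partial) / (1 - r_partial))   # Fisher z (same as atanh)
   stat <- sqrt(N - k - 3) * abs(z)
   p_value <- 2 * (1 - pnorm(stat))
   list(statistic = stat, p.value = p_value,
        reject = stat > qnorm(1 - alpha / 2))
 }

 # e.g. a partial correlation of 0.4 estimated from 50 observations with 2 controls
 pcor_test(0.4, N = 50, k = 2)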


Semipartial correlation (part correlation)

The semipartial (or part) correlation statistic is similar to the partial correlation statistic; both compare variations of two variables after certain factors are controlled for. However, to calculate the semipartial correlation, one holds the third variable constant for either ''X'' or ''Y'' but not both; whereas for the partial correlation, one holds the third variable constant for both. The semipartial correlation compares the unique variation of one variable (having removed variation associated with the ''Z'' variable(s)) with the unfiltered variation of the other, while the partial correlation compares the unique variation of one variable to the unique variation of the other. The semipartial correlation can be viewed as more practically relevant "because it is scaled to (i.e., relative to) the total variability in the dependent (response) variable." Conversely, it is less theoretically useful because it is less precise about the role of the unique contribution of the independent variable.

The absolute value of the semipartial correlation of ''X'' with ''Y'' is always less than or equal to that of the partial correlation of ''X'' with ''Y''. The reason is this: Suppose the correlation of ''X'' with ''Z'' has been removed from ''X'', giving the residual vector ''e''''x''. In computing the semipartial correlation, ''Y'' still contains both unique variance and variance due to its association with ''Z''. But ''e''''x'', being uncorrelated with ''Z'', can only explain some of the unique part of the variance of ''Y'' and not the part related to ''Z''. In contrast, with the partial correlation, only ''e''''y'' (the part of the variance of ''Y'' that is unrelated to ''Z'') is to be explained, so there is less variance of the type that ''e''''x'' cannot explain.
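
For concreteness, a small R sketch using the toy data from the example above: the semipartial correlation correlates the ''Z''-residualized ''X'' with the raw ''Y'', while the partial correlation correlates the two residualized variables.

 # Semipartial (part) correlation of X with Y, controlling Z only in X,
 # versus the partial correlation, controlling Z in both X and Y.
 X <- c(2, 4, 15, 20)
 Y <- c(1, 2, 3, 4)
 Z <- c(0, 0, 1, 1)
 eX <- residuals(lm(X ~ Z))
 eY <- residuals(lm(Y ~ Z))
 cor(eX, Y)    # semipartial correlation, roughly 0.41
 cor(eX, eY)   # partial correlation, roughly 0.92 (larger in absolute value)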


Use in time series analysis

In time series analysis, the partial autocorrelation function (sometimes "partial correlation function") of a time series is defined, for lag ''h'', as

:\varphi(h) = \rho_{x_0 x_h \cdot \{x_1,\, \ldots,\, x_{h-1}\}}

This function is used to determine the appropriate lag length for an autoregression.
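
As an illustration, base R's stats::pacf computes exactly this sample quantity. The sketch below simulates an AR(2) series (the coefficients and seed are arbitrary) and shows how the lag length of an autoregression is read off from where the partial autocorrelations cut off.

 # Sample partial autocorrelation function of a simulated AR(2) series.
 set.seed(1)
 x <- arima.sim(model = list(ar = c(0.6, 0.3)), n = 500)
 pacf(x, lag.max = 10, plot = FALSE)
 # For an AR(2) process, the partial autocorrelations at lags 1 and 2 are
 # substantial, while those at lags 3 and beyond are close to zero.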


Partial correlations with shrinkage


When the sample size is smaller than the number of variables (the high-dimensional setting), estimating partial correlations can be challenging. In this scenario, the sample covariance matrix \hat{\Sigma} is not well-conditioned, and finding its inverse \hat{\Omega} becomes problematic. Shrinkage estimation methods improve \hat{\Sigma} or \hat{\Omega} and produce more reliable partial correlation estimates. One example is the Ledoit–Wolf shrinkage estimator,

:\hat{\Sigma}^{\text{shrink}} = \lambda T + (1 - \lambda)\hat{\Sigma}

where \hat{\Sigma} is the sample covariance matrix, T is a target matrix (e.g., a diagonal matrix), and the shrinkage intensity \lambda \in (0,1). The partial correlation under the Ledoit–Wolf shrinkage is then

:\hat{\rho}_{ij}^{\text{shrink}} = -\frac{\hat{\omega}_{ij}}{\sqrt{\hat{\omega}_{ii}\,\hat{\omega}_{jj}}}

where \hat{\Omega}^{\text{shrink}} = (\hat{\omega}_{ij}) is the inverse of \hat{\Sigma}^{\text{shrink}}. This method is used in a variety of fields including finance and genomics. (Ledoit, O., & Wolf, M. (2022). The power of (non-)linear shrinking: A review and guide to covariance matrix estimation. ''Journal of Financial Econometrics'', 20(1), 187–218. https://doi.org/10.1093/jjfinec/nbaa007)
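
A minimal R sketch of the idea. The shrinkage intensity below is fixed by hand purely for illustration; in practice \lambda would be estimated from the data (as in the Ledoit–Wolf approach), or a packaged implementation (e.g., the corpcor package) would be used.

 # Shrinkage-based partial correlations with a diagonal target and a
 # hand-picked shrinkage intensity (illustrative only; lambda would
 # normally be estimated from the data).
 shrink_pcor <- function(data, lambda = 0.2) {
   S <- cov(data)
   T_target <- diag(diag(S))                      # diagonal target matrix
   S_shrunk <- lambda * T_target + (1 - lambda) * S
   omega <- solve(S_shrunk)                       # shrunk precision matrix
   d <- 1 / sqrt(diag(omega))
   pc <- -omega * outer(d, d)                     # -w_ij / sqrt(w_ii * w_jj)
   diag(pc) <- 1
   pc
 }

 # Example: more variables than observations, where the plain sample
 # covariance matrix would be singular.
 set.seed(1)
 data_hd <- matrix(rnorm(10 * 20), nrow = 10, ncol = 20)
 shrink_pcor(data_hd)[1:3, 1:3]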


See also

* Linear regression
* Conditional independence
* Multiple correlation
* Partial information decomposition


References


External links

* Mathematical formulae in the "Description" section of the IMSL Numerical Library PCORR routine
