In probability theory and statistics, partial correlation measures the degree of association between two random variables, with the effect of a set of controlling random variables removed. When determining the numerical relationship between two variables of interest, using their correlation coefficient will give misleading results if there is another confounding variable that is numerically related to both variables of interest. This misleading information can be avoided by controlling for the confounding variable, which is done by computing the partial correlation coefficient. This is precisely the motivation for including other right-side variables in a multiple regression; but while multiple regression gives unbiased results for the effect size, it does not give a numerical measure of the strength of the relationship between the two variables of interest.
For example, given economic data on the consumption, income, and wealth of various individuals, consider the relationship between consumption and income. Failing to control for wealth when computing a correlation coefficient between consumption and income would give a misleading result, since income might be numerically related to wealth which in turn might be numerically related to consumption; a measured correlation between consumption and income might actually be contaminated by these other correlations. The use of a partial correlation avoids this problem.
Like the correlation coefficient, the partial correlation coefficient takes on a value in the range from –1 to 1. The value –1 conveys a perfect negative correlation controlling for some variables (that is, an exact linear relationship in which higher values of one variable are associated with lower values of the other); the value 1 conveys a perfect positive linear relationship, and the value 0 conveys that there is no linear relationship.
The partial correlation coincides with the conditional correlation if the random variables are jointly distributed as the multivariate normal, other elliptical, multivariate hypergeometric, multivariate negative hypergeometric, multinomial, or Dirichlet distribution, but not in general otherwise.
Formal definition
Formally, the partial correlation between ''X'' and ''Y'' given a set of ''n'' controlling variables Z = {''Z''1, ''Z''2, ..., ''Z''''n''}, written ''ρ''''XY''·Z, is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively. The first-order partial correlation (i.e., when ''n'' = 1) is the difference between a correlation and the product of the removable correlations divided by the product of the coefficients of alienation of the removable correlations. The coefficient of alienation, and its relation with joint variance through correlation, are available in Guilford (1973, pp. 344–345).
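For a single controlling variable ''Z'' (i.e., ''n'' = 1), this verbal description corresponds to the formula (restated here for convenience; it reappears in the Computation section below)
: $\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\,\rho_{ZY}}{\sqrt{1-\rho_{XZ}^{2}}\,\sqrt{1-\rho_{ZY}^{2}}},$
where $\sqrt{1-\rho_{XZ}^{2}}$ and $\sqrt{1-\rho_{ZY}^{2}}$ are the coefficients of alienation of the removable correlations ''ρ''''XZ'' and ''ρ''''ZY''.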
Computation
Using linear regression
A simple way to compute the sample partial correlation for some data is to solve the two associated linear regression problems and calculate the correlation between the residuals. Let ''X'' and ''Y'' be random variables taking real values, and let Z be the ''n''-dimensional vector-valued random variable. Let ''xi'', ''yi'' and z''i'' denote the ''i''th of ''N'' i.i.d. observations from some joint probability distribution over real random variables ''X'', ''Y'', and Z, with z''i'' having been augmented with a 1 to allow for a constant term in the regression. Solving the linear regression problems amounts to finding (''n''+1)-dimensional regression coefficient vectors $\mathbf{w}_X^*$ and $\mathbf{w}_Y^*$ such that
: $\mathbf{w}_X^* = \arg\min_{\mathbf{w}} \sum_{i=1}^{N} \left(x_i - \langle \mathbf{w}, \mathbf{z}_i \rangle\right)^2$
: $\mathbf{w}_Y^* = \arg\min_{\mathbf{w}} \sum_{i=1}^{N} \left(y_i - \langle \mathbf{w}, \mathbf{z}_i \rangle\right)^2$
where ''N'' is the number of observations, and $\langle \mathbf{w}, \mathbf{z}_i \rangle$ is the scalar product between the vectors $\mathbf{w}$ and $\mathbf{z}_i$.
The residuals are then
: $e_{X,i} = x_i - \langle \mathbf{w}_X^*, \mathbf{z}_i \rangle$
: $e_{Y,i} = y_i - \langle \mathbf{w}_Y^*, \mathbf{z}_i \rangle$
and the sample partial correlation is then given by the usual formula for sample correlation, but between these new ''derived'' values:
: $\hat{\rho}_{XY\cdot\mathbf{Z}} = \frac{N\sum_{i=1}^{N} e_{X,i}\,e_{Y,i} \;-\; \sum_{i=1}^{N} e_{X,i} \sum_{i=1}^{N} e_{Y,i}}{\sqrt{N\sum_{i=1}^{N} e_{X,i}^{2} - \left(\sum_{i=1}^{N} e_{X,i}\right)^{2}}\;\sqrt{N\sum_{i=1}^{N} e_{Y,i}^{2} - \left(\sum_{i=1}^{N} e_{Y,i}\right)^{2}}}$
In the first expression the three terms after minus signs all equal 0 since each contains the sum of residuals from an ordinary least squares regression.
Example
Consider the following data on three variables, ''X'', ''Y'', and ''Z'':
: ''X'' = (2, 4, 15, 20), ''Y'' = (1, 2, 3, 4), ''Z'' = (0, 0, 1, 1)
Computing the Pearson correlation coefficient between variables ''X'' and ''Y'' results in approximately 0.970, while computing the partial correlation between ''X'' and ''Y'', using the formula given above, gives a partial correlation of 0.919. The computations were done using R with the following code.
> X <- c(2,4,15,20)
> Y <- c(1,2,3,4)
> Z <- c(0,0,1,1)
> mm1 <- lm(X~Z)
> res1 <- mm1$residuals
> mm2 <- lm(Y~Z)
> res2 <- mm2$residuals
> cor(res1,res2)
[1] 0.919145
> cor(X,Y)
[1] 0.9695016
> generalCorr::parcorMany(cbind(X,Y,Z))
     nami namj partij   partji rijMrji
[1,] "X"  "Y"  "0.8844" "1"    "-0.1156"
[2,] "X"  "Z"  "0.1581" "1"    "-0.8419"
The lower part of the above output reports the generalized nonlinear partial correlation coefficient between ''X'' and ''Y'' after removing the nonlinear effect of ''Z'' as 0.8844, and the generalized partial correlation coefficient between ''X'' and ''Z'' after removing the nonlinear effect of ''Y'' as 0.1581. See the R package 'generalCorr' and its vignettes for details. Simulation and other details are in Vinod (2017), "Generalized correlation and kernel causality with applications in development economics," Communications in Statistics – Simulation and Computation, vol. 46, pp. 4513–4534, available online 29 Dec 2015, URL https://doi.org/10.1080/03610918.2015.1122048.
Using recursive formula
It can be computationally expensive to solve the linear regression problems. In fact, the ''n''th-order partial correlation (i.e., with |Z| = ''n'') can be easily computed from three (''n'' − 1)th-order partial correlations. The zeroth-order partial correlation ''ρ''''XY''·Ø is defined to be the regular correlation coefficient ''ρ''''XY''.
It holds, for any $Z_0 \in \mathbf{Z}$, that
: $\rho_{XY\cdot\mathbf{Z}} = \frac{\rho_{XY\cdot\mathbf{Z}\setminus\{Z_0\}} \;-\; \rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}\,\rho_{Z_0Y\cdot\mathbf{Z}\setminus\{Z_0\}}}{\sqrt{1-\rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}^{2}}\,\sqrt{1-\rho_{Z_0Y\cdot\mathbf{Z}\setminus\{Z_0\}}^{2}}}$
Naïvely implementing this computation as a recursive algorithm yields an exponential time complexity. However, this computation has the overlapping subproblems property, such that using dynamic programming or simply caching the results of the recursive calls yields a complexity of $\mathcal{O}(n^3)$.
Note that in the case where ''Z'' is a single variable, this reduces to:
: $\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\,\rho_{ZY}}{\sqrt{1-\rho_{XZ}^{2}}\,\sqrt{1-\rho_{ZY}^{2}}}$
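For illustration, a minimal R sketch of this single-control case (the helper name pcor1 is made up for this example, not from a package):
# First-order partial correlation from the three pairwise correlations
# (illustrative sketch; pcor1 is a hypothetical helper name)
pcor1 <- function(x, y, z) {
  rxy <- cor(x, y); rxz <- cor(x, z); rzy <- cor(z, y)
  (rxy - rxz * rzy) / (sqrt(1 - rxz^2) * sqrt(1 - rzy^2))
}
X <- c(2, 4, 15, 20); Y <- c(1, 2, 3, 4); Z <- c(0, 0, 1, 1)
pcor1(X, Y, Z)   # about 0.919, matching the regression-based result above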
Using matrix inversion
The partial correlation can also be written in terms of the joint precision matrix. Consider a set of random variables $\mathbf{V} = \{X_1, \dots, X_n\}$ of cardinality ''n''. We want the partial correlation between two variables $X_i$ and $X_j$ given all others, i.e., $\mathbf{V} \setminus \{X_i, X_j\}$. Suppose the (joint/full) covariance matrix $\Sigma = (\sigma_{ij})$ is positive definite and therefore invertible. If the precision matrix is defined as $\Omega = (p_{ij}) = \Sigma^{-1}$, then
: $\rho_{X_i X_j \cdot \mathbf{V}\setminus\{X_i,X_j\}} = -\frac{p_{ij}}{\sqrt{p_{ii}\,p_{jj}}}$
Computing this requires $\Sigma^{-1}$, the inverse of the covariance matrix, which can be computed in $\mathcal{O}(n^3)$ time (using the sample covariance matrix to obtain a sample partial correlation). Note that only a single matrix inversion is required to give ''all'' the partial correlations between pairs of variables in $\mathbf{V}$.
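A minimal R sketch of this computation with base R's solve() (illustrative only; the function name pcor_from_cov is hypothetical, not from any package):
# All pairwise partial correlations from the precision matrix
pcor_from_cov <- function(data) {
  S     <- cov(data)             # sample covariance matrix
  omega <- solve(S)              # precision matrix, Omega = S^{-1}
  d     <- 1 / sqrt(diag(omega))
  pc    <- -omega * outer(d, d)  # entries -p_ij / sqrt(p_ii * p_jj)
  diag(pc) <- 1
  dimnames(pc) <- dimnames(S)
  pc
}
# For the example data above, pcor_from_cov(cbind(X, Y, Z))["X", "Y"] is about 0.919.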
To prove the identity above, return to the previous notation (i.e. $X, Y, \mathbf{Z} \leftrightarrow X_i, X_j, \mathbf{V}\setminus\{X_i, X_j\}$) and start with the definition of partial correlation: ''ρ''''XY''·Z is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively.
First, suppose $\beta, \gamma$ are the coefficients of the linear regression fits; that is,
: $\beta = \operatorname{argmin}_{\beta}\; \mathbb{E}\left\lVert X - \beta^{\mathsf T} \mathbf{Z} \right\rVert^{2}$
: $\gamma = \operatorname{argmin}_{\gamma}\; \mathbb{E}\left\lVert Y - \gamma^{\mathsf T} \mathbf{Z} \right\rVert^{2}$
Write the joint covariance matrix for the vector $(X, Y, \mathbf{Z}^{\mathsf T})^{\mathsf T}$ as
: $\Sigma = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} & \Sigma_{X\mathbf{Z}} \\ \Sigma_{YX} & \Sigma_{YY} & \Sigma_{Y\mathbf{Z}} \\ \Sigma_{\mathbf{Z}X} & \Sigma_{\mathbf{Z}Y} & \Sigma_{\mathbf{Z}\mathbf{Z}} \end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix}$
where $C_{11} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix}$, $C_{12} = \begin{bmatrix} \Sigma_{X\mathbf{Z}} \\ \Sigma_{Y\mathbf{Z}} \end{bmatrix}$, $C_{21} = \begin{bmatrix} \Sigma_{\mathbf{Z}X} & \Sigma_{\mathbf{Z}Y} \end{bmatrix}$, and $C_{22} = \Sigma_{\mathbf{Z}\mathbf{Z}}$. Then the standard formula for linear regression gives
: $\beta = \left(\Sigma_{\mathbf{Z}\mathbf{Z}}\right)^{-1} \Sigma_{\mathbf{Z}X}, \qquad \gamma = \left(\Sigma_{\mathbf{Z}\mathbf{Z}}\right)^{-1} \Sigma_{\mathbf{Z}Y}$
Hence, the residuals can be written as
: $e_X = X - \beta^{\mathsf T}\mathbf{Z} = X - \Sigma_{X\mathbf{Z}}\,\Sigma_{\mathbf{Z}\mathbf{Z}}^{-1}\,\mathbf{Z}, \qquad e_Y = Y - \gamma^{\mathsf T}\mathbf{Z} = Y - \Sigma_{Y\mathbf{Z}}\,\Sigma_{\mathbf{Z}\mathbf{Z}}^{-1}\,\mathbf{Z}$
Note that ''e''''X'' and ''e''''Y'' have expectation zero because of the inclusion of an intercept term in Z. Computing the covariance now gives
: $\operatorname{Cov}(e_X, e_Y) = \Sigma_{XY} - \Sigma_{X\mathbf{Z}}\,\Sigma_{\mathbf{Z}\mathbf{Z}}^{-1}\,\Sigma_{\mathbf{Z}Y}$
Next, write the precision matrix in a similar block form:
: $\Omega = \Sigma^{-1} = \begin{bmatrix} \Omega_{XX} & \Omega_{XY} & \Omega_{X\mathbf{Z}} \\ \Omega_{YX} & \Omega_{YY} & \Omega_{Y\mathbf{Z}} \\ \Omega_{\mathbf{Z}X} & \Omega_{\mathbf{Z}Y} & \Omega_{\mathbf{Z}\mathbf{Z}} \end{bmatrix} = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}$
Then, by Schur's formula for block-matrix inversion,
: $P_{11}^{-1} = C_{11} - C_{12}\,C_{22}^{-1}\,C_{21}$
The entries of the right-hand-side matrix are precisely the covariances previously computed above, giving
: $P_{11}^{-1} = \begin{bmatrix} \operatorname{Cov}(e_X, e_X) & \operatorname{Cov}(e_X, e_Y) \\ \operatorname{Cov}(e_Y, e_X) & \operatorname{Cov}(e_Y, e_Y) \end{bmatrix}$
Using the formula for the inverse of a 2×2 matrix gives
: $P_{11} = \begin{bmatrix} \Omega_{XX} & \Omega_{XY} \\ \Omega_{YX} & \Omega_{YY} \end{bmatrix} = \frac{1}{\operatorname{Cov}(e_X, e_X)\operatorname{Cov}(e_Y, e_Y) - \operatorname{Cov}(e_X, e_Y)^{2}} \begin{bmatrix} \operatorname{Cov}(e_Y, e_Y) & -\operatorname{Cov}(e_X, e_Y) \\ -\operatorname{Cov}(e_Y, e_X) & \operatorname{Cov}(e_X, e_X) \end{bmatrix}$
So indeed, the partial correlation is
: $\rho_{XY\cdot\mathbf{Z}} = \frac{\operatorname{Cov}(e_X, e_Y)}{\sqrt{\operatorname{Cov}(e_X, e_X)\operatorname{Cov}(e_Y, e_Y)}} = -\frac{\Omega_{XY}}{\sqrt{\Omega_{XX}\,\Omega_{YY}}}$
as claimed above.
Interpretation
Geometrical
Let three variables ''X'', ''Y'', ''Z'' (where ''Z'' is the "control" or "extra variable") be chosen from a joint probability distribution over ''n'' variables V. Further, let v''i'', 1 ≤ ''i'' ≤ ''N'', be ''N'' ''n''-dimensional i.i.d. observations taken from the joint probability distribution over V. The geometrical interpretation comes from considering the ''N''-dimensional vectors x (formed by the successive values of ''X'' over the observations), y (formed by the values of ''Y''), and z (formed by the values of ''Z'').
It can be shown that the residuals ''eX,i'' coming from the linear regression of ''X'' on Z, if also considered as an ''N''-dimensional vector e''X'', have a zero scalar product with the vector z generated by Z. This means that the residuals vector lies on an (''N''–1)-dimensional hyperplane ''S''z that is perpendicular to z.
The same also applies to the residuals ''eY,i'' generating a vector e''Y''. The desired partial correlation is then the cosine of the angle ''φ'' between the projections e''X'' and e''Y'' of x and y, respectively, onto the hyperplane perpendicular to z.
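As a quick numerical check of this picture (a sketch reusing the data from the example above, not part of the original text), the cosine of the angle between the two residual vectors equals the sample partial correlation:
# Cosine of the angle between the residual vectors e_X and e_Y
X <- c(2, 4, 15, 20); Y <- c(1, 2, 3, 4); Z <- c(0, 0, 1, 1)
eX <- residuals(lm(X ~ Z))
eY <- residuals(lm(Y ~ Z))
cos_phi <- sum(eX * eY) / (sqrt(sum(eX^2)) * sqrt(sum(eY^2)))
cos_phi   # about 0.919, matching cor(eX, eY) since the residuals have mean zero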
As conditional independence test
With the assumption that all involved variables are multivariate Gaussian, the partial correlation ''ρ''''XY''·Z is zero if and only if ''X'' is conditionally independent from ''Y'' given Z. This property does not hold in the general case.
To test if a sample partial correlation implies that the true population partial correlation differs from 0, Fisher's ''z''-transform of the partial correlation can be used:
: $z(\hat{\rho}_{XY\cdot\mathbf{Z}}) = \frac{1}{2}\ln\!\left(\frac{1+\hat{\rho}_{XY\cdot\mathbf{Z}}}{1-\hat{\rho}_{XY\cdot\mathbf{Z}}}\right)$
The null hypothesis is $H_0: \rho_{XY\cdot\mathbf{Z}} = 0$, to be tested against the two-tail alternative $H_A: \rho_{XY\cdot\mathbf{Z}} \neq 0$. $H_0$ can be rejected if
: $\sqrt{N - |\mathbf{Z}| - 3}\,\left|z(\hat{\rho}_{XY\cdot\mathbf{Z}})\right| > \Phi^{-1}(1 - \alpha/2),$
where $\Phi$ is the cumulative distribution function of a Gaussian distribution with zero mean and unit standard deviation, $\alpha$ is the significance level of $H_0$, and ''N'' is the sample size. This ''z''-transform is approximate, and the actual distribution of the sample (partial) correlation coefficient is not straightforward. However, an exact ''t''-test based on a combination of the partial regression coefficient, the partial correlation coefficient, and the partial variances is available.
The distribution of the sample partial correlation was described by Fisher.
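A minimal R sketch of the approximate ''z''-test described above (illustrative only; the function name pcor_z_test is made up for this example):
# Fisher z-test for H0: rho_XY.Z = 0 (illustrative sketch)
pcor_z_test <- function(rho_hat, N, k, alpha = 0.05) {
  # rho_hat: sample partial correlation, N: sample size, k = |Z|: number of controls
  z    <- 0.5 * log((1 + rho_hat) / (1 - rho_hat))  # Fisher z-transform
  stat <- sqrt(N - k - 3) * abs(z)                  # requires N > k + 3 to be meaningful
  crit <- qnorm(1 - alpha / 2)                      # Phi^{-1}(1 - alpha/2)
  list(statistic = stat, critical = crit, reject_H0 = stat > crit)
}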
Semipartial correlation (part correlation)
The semipartial (or part) correlation statistic is similar to the partial correlation statistic; both compare variations of two variables after certain factors are controlled for. However, to calculate the semipartial correlation, one holds the third variable constant for either ''X'' or ''Y'' but not both; whereas for the partial correlation, one holds the third variable constant for both. The semipartial correlation compares the unique variation of one variable (having removed variation associated with the ''Z'' variable(s)) with the unfiltered variation of the other, while the partial correlation compares the unique variation of one variable to the unique variation of the other.
The semipartial correlation can be viewed as more practically relevant "because it is scaled to (i.e., relative to) the total variability in the dependent (response) variable" (StatSoft, Inc. (2010), "Semi-Partial (or Part) Correlation", ''Electronic Statistics Textbook'', Tulsa, OK: StatSoft, accessed January 15, 2011). Conversely, it is less theoretically useful because it is less precise about the role of the unique contribution of the independent variable.
The absolute value of the semipartial correlation of ''X'' with ''Y'' is always less than or equal to that of the partial correlation of ''X'' with ''Y''. The reason is this: Suppose the correlation of ''X'' with ''Z'' has been removed from ''X'', giving the residual vector ''e''''x''. In computing the semipartial correlation, ''Y'' still contains both unique variance and variance due to its association with ''Z''. But ''e''''x'', being uncorrelated with ''Z'', can only explain some of the unique part of the variance of ''Y'' and not the part related to ''Z''. In contrast, with the partial correlation, only ''e''''y'' (the part of the variance of ''Y'' that is unrelated to ''Z'') is to be explained, so there is less variance of the type that ''e''''x'' cannot explain.
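As a small illustration with the example data used earlier (a sketch, not from the cited sources), removing ''Z'' from ''X'' only gives the semipartial correlation, while removing it from both variables gives the partial correlation:
# Semipartial (part) correlation of X with Y versus the partial correlation
X <- c(2, 4, 15, 20); Y <- c(1, 2, 3, 4); Z <- c(0, 0, 1, 1)
eX <- residuals(lm(X ~ Z))    # X with the influence of Z removed
eY <- residuals(lm(Y ~ Z))    # Y with the influence of Z removed
semipartial <- cor(eX, Y)     # Z held constant for X only: about 0.41
partial     <- cor(eX, eY)    # Z held constant for both:   about 0.92
abs(semipartial) <= abs(partial)   # TRUE, as stated above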
Use in time series analysis
In time series analysis, the partial autocorrelation function (sometimes "partial correlation function") of a time series is defined, for lag ''k'', as
: $\varphi(k) = \rho_{X_{t+k}\,X_{t}\,\cdot\,\{X_{t+1},\,\dots,\,X_{t+k-1}\}}$
This function is used to determine the appropriate lag length for an autoregression.
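In R, the sample partial autocorrelation function is available as pacf() in the base stats package; a brief sketch of its use for choosing an autoregressive lag length:
# Partial autocorrelations of a simulated AR(2) series
set.seed(1)
x <- arima.sim(model = list(ar = c(0.6, 0.3)), n = 500)
pacf(x, lag.max = 10)   # partial autocorrelations beyond lag 2 should be small,
                        # suggesting an AR(2) model for this series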
See also
* Linear regression
* Conditional independence
* Multiple correlation
References
External links
* Mathematical formulae in the "Description" section of the IMSL Numerical Library PCORR routine