In probability theory and statistics, coskewness is a measure of how much three random variables change together. Coskewness is the third standardized cross central moment, related to skewness as covariance is related to variance. In 1976, Kraus and Litzenberger used it to examine risk in stock market investments. The application to risk was extended by Harvey and Siddique in 2000.

If three random variables exhibit positive coskewness they will tend to undergo extreme deviations at the same time, an odd number of which are in the positive direction (so all three random variables undergoing extreme positive deviations, or one undergoing an extreme positive deviation while the other two undergo extreme negative deviations). Similarly, if three random variables exhibit negative coskewness they will tend to undergo extreme deviations at the same time, an even number of which are in the positive direction (so all three random variables undergoing extreme negative deviations, or one undergoing an extreme negative deviation while the other two undergo extreme positive deviations).


Definition

For three random variables ''X'', ''Y'' and ''Z'', the non-trivial coskewness statistic is defined as:
: S(X,Y,Z) = \frac{\operatorname{E}\big[(X - \operatorname{E}[X])\,(Y - \operatorname{E}[Y])\,(Z - \operatorname{E}[Z])\big]}{\sigma_X\,\sigma_Y\,\sigma_Z}
where E[''X''] is the expected value of ''X'', also known as the mean of ''X'', and \sigma_X is the standard deviation of ''X''.
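As a concrete illustration, the sample analogue of this statistic replaces each expectation with a sample mean. The following sketch (written in Python with NumPy; the function name ''coskewness'' is chosen here for illustration and is not a standard library routine) estimates S(X,Y,Z) from data:

<syntaxhighlight lang="python">
import numpy as np

def coskewness(x, y, z):
    """Sample estimate of the coskewness S(X, Y, Z).

    Each expectation in the definition is replaced by a sample mean,
    and each population standard deviation by its sample analogue.
    """
    x, y, z = np.asarray(x), np.asarray(y), np.asarray(z)
    xc, yc, zc = x - x.mean(), y - y.mean(), z - z.mean()
    # third standardized cross central moment
    return np.mean(xc * yc * zc) / (x.std() * y.std() * z.std())

# Sanity check: coskewness(x, x, x) reduces to the ordinary skewness of x.
rng = np.random.default_rng(0)
x = rng.exponential(size=100_000)   # Exp(1) has skewness 2
print(coskewness(x, x, x))          # prints a value close to 2
</syntaxhighlight>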


Properties

Skewness is a special case of the coskewness when the three random variables are identical:
: S(X,X,X) = \frac{\operatorname{E}\big[(X - \operatorname{E}[X])^3\big]}{\sigma_X^3} = \operatorname{skewness}(X).
For two random variables, ''X'' and ''Y'', the skewness of the sum, ''X'' + ''Y'', is
: S_{X+Y} = \frac{1}{\sigma_{X+Y}^3}\Big[\sigma_X^3\, S_X + 3\,\sigma_X^2 \sigma_Y\, S(X,X,Y) + 3\,\sigma_X \sigma_Y^2\, S(X,Y,Y) + \sigma_Y^3\, S_Y\Big],
where ''S''<sub>''X''</sub> is the skewness of ''X'' and \sigma_X is the standard deviation of ''X''. It follows that the sum of two random variables can be skewed (''S''<sub>''X''+''Y''</sub> ≠ 0) even if both random variables have zero skew in isolation (''S''<sub>''X''</sub> = 0 and ''S''<sub>''Y''</sub> = 0).

The coskewness between variables ''X'' and ''Y'' does not depend on the scale on which the variables are expressed. If we are analyzing the relationship between ''X'' and ''Y'', the coskewness between ''X'' and ''Y'' will be the same as the coskewness between ''a'' + ''bX'' and ''c'' + ''dY'', where ''a'', ''b'', ''c'', and ''d'' are constants with ''b'' and ''d'' positive.
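The sum formula and the scale invariance can both be checked numerically. The sketch below (Python with NumPy; the correlated pair chosen here is arbitrary, and ''coskew'' is the same sample estimator as in the sketch above, repeated so this snippet is self-contained) compares the skewness of ''X'' + ''Y'' with the right-hand side of the formula:

<syntaxhighlight lang="python">
import numpy as np

def coskew(x, y, z):
    # sample third standardized cross central moment
    xc, yc, zc = x - x.mean(), y - y.mean(), z - z.mean()
    return np.mean(xc * yc * zc) / (x.std() * y.std() * z.std())

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(size=n)
y = x**2 - 1 + 0.5 * rng.normal(size=n)   # an arbitrary variable correlated with x

s = x + y
lhs = coskew(s, s, s)                     # skewness of the sum, measured directly

sx, sy = x.std(), y.std()
rhs = (sx**3 * coskew(x, x, x)
       + 3 * sx**2 * sy * coskew(x, x, y)
       + 3 * sx * sy**2 * coskew(x, y, y)
       + sy**3 * coskew(y, y, y)) / s.std()**3

print(lhs, rhs)                           # the two agree up to sampling noise

# Scale invariance: shifting and positively rescaling leaves coskewness unchanged.
print(coskew(x, x, y), coskew(2 + 3 * x, 2 + 3 * x, -1 + 0.5 * y))
</syntaxhighlight>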


Example

Let ''X'' be standard normally distributed and ''Y'' be the distribution obtained by setting ''X''=''Y'' whenever ''X''<0 and drawing ''Y'' independently from a standard
half-normal distribution whenever ''X''>0. In other words, ''X'' and ''Y'' are both standard normally distributed with the property that they are completely correlated for negative values and uncorrelated apart from sign for positive values. The joint probability density function is
: f_{X,Y}(x,y) = \frac{e^{-x^2/2}}{\sqrt{2\pi}} \left( H(-x)\,\delta(x-y) + 2H(x)H(y)\,\frac{e^{-y^2/2}}{\sqrt{2\pi}} \right)
where ''H''(''x'') is the Heaviside step function and δ(''x'') is the Dirac delta function. The third moments are easily calculated by integrating with respect to this density:
: S(X,X,Y) = S(X,Y,Y) = -\frac{1}{\sqrt{2\pi}} \approx -0.399
Note that although ''X'' and ''Y'' are individually standard normally distributed, the distribution of the sum ''X''+''Y'' is significantly skewed. From integration with respect to the density, we find that the covariance of ''X'' and ''Y'' is
: \operatorname{cov}(X,Y) = \frac{1}{2} + \frac{1}{\pi}
from which it follows that the standard deviation of their sum is
: \sigma_{X+Y} = \sqrt{3 + \frac{2}{\pi}}.
Using the skewness sum formula above, we have
: S_{X+Y} = -\frac{6}{\sqrt{2\pi}\left(3 + \frac{2}{\pi}\right)^{3/2}} \approx -0.345.
This can also be computed directly from the probability density function of the sum:
: f_{X+Y}(u) = \frac{e^{-u^2/8}}{2\sqrt{2\pi}}\, H(-u) + \frac{e^{-u^2/4}}{\sqrt{\pi}}\, \operatorname{erf}\!\left(\frac{u}{2}\right) H(u)
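These values can be reproduced by simulation. The following sketch (a Monte Carlo check written here for illustration; the sampling of ''Y'' simply mirrors its verbal definition above) recovers the third moments, the covariance, and the skewness of the sum to within sampling error:

<syntaxhighlight lang="python">
import numpy as np

def coskew(x, y, z):
    # sample third standardized cross central moment
    xc, yc, zc = x - x.mean(), y - y.mean(), z - z.mean()
    return np.mean(xc * yc * zc) / (x.std() * y.std() * z.std())

rng = np.random.default_rng(2)
n = 2_000_000

x = rng.normal(size=n)
# Y equals X whenever X < 0; otherwise Y is an independent standard half-normal draw.
y = np.where(x < 0, x, np.abs(rng.normal(size=n)))

print(coskew(x, x, y))       # ≈ -1/sqrt(2*pi) ≈ -0.399
print(coskew(x, y, y))       # ≈ -0.399
print(np.cov(x, y)[0, 1])    # ≈ 1/2 + 1/pi ≈ 0.818

s = x + y
print(coskew(s, s, s))       # skewness of the sum, ≈ -0.345
</syntaxhighlight>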


See also

* Moment (mathematics)
* Cokurtosis


References


Further reading

* {{cite journal |last1=Kraus |first1=Alan |author2=Robert H. Litzenberger |title=Skewness Preference and the Valuation of Risk Assets |journal=The Journal of Finance |year=1976 |volume=31 |issue=4 |pages=1085–1100 |doi=10.1111/j.1540-6261.1976.tb01961.x}}