Law of Total Cumulance

In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance. It has applications in the analysis of time series. It was introduced by David Brillinger.

The law is most transparent when stated in its most general form, for ''joint'' cumulants, rather than for cumulants of a specified order for just one random variable. In general, we have

:\kappa(X_1,\dots,X_n)=\sum_\pi \kappa(\kappa(X_i : i\in B \mid Y) : B \in \pi),

where
* ''κ''(''X''_1, ..., ''X''_''n'') is the joint cumulant of the ''n'' random variables ''X''_1, ..., ''X''_''n'',
* the sum is over all partitions \pi of the set \{1,\dots,n\} of indices,
* "''B'' ∈ \pi" means ''B'' runs through the whole list of "blocks" of the partition \pi, and
* ''κ''(''X''_''i'' : ''i'' ∈ ''B'' | ''Y'') is a conditional cumulant given the value of the random variable ''Y''. It is therefore a random variable in its own right, a function of the random variable ''Y''.
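For example, taking ''n'' = 2 and ''X''_1 = ''X''_2 = ''X'', the index set \{1,2\} has only two partitions, the single block \{1,2\} and the pair of singletons \{1\},\{2\}, and the general formula reduces to the law of total variance:

:\operatorname{var}(X)=\kappa(X,X)=\kappa_1(\kappa_2(X\mid Y))+\kappa_2(\kappa_1(X\mid Y))=\operatorname{E}(\operatorname{var}(X\mid Y))+\operatorname{var}(\operatorname{E}(X\mid Y)).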


Examples


The special case of just one random variable and ''n'' = 2 or 3

Only when ''n'' = 2 or 3 is the ''n''th cumulant the same as the ''n''th central moment. The case ''n'' = 2 is well known (see law of total variance). Below is the case ''n'' = 3. The notation ''μ''_3 means the third central moment:

:\mu_3(X)=\operatorname{E}(\mu_3(X\mid Y))+\mu_3(\operatorname{E}(X\mid Y))+3\operatorname{cov}(\operatorname{E}(X\mid Y),\operatorname{var}(X\mid Y)).
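As a quick numerical check of this identity, the following Python sketch compares the two sides for a simple conditional model. The model (a two-component normal mixture) and all parameter values are illustrative choices, not part of the statement above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model: Y ~ Bernoulli(p); X | Y=0 ~ N(0, 1), X | Y=1 ~ N(2, 4).
p = 0.3
w = np.array([1 - p, p])        # P(Y = 0), P(Y = 1)
mu = np.array([0.0, 2.0])       # E(X | Y)
var = np.array([1.0, 4.0])      # var(X | Y)
mu3 = np.array([0.0, 0.0])      # mu_3(X | Y); zero for normal components

# Left-hand side: mu_3(X), estimated from a large sample of X.
n = 2_000_000
y = rng.random(n) < p
x = np.where(y, mu[1] + np.sqrt(var[1]) * rng.standard_normal(n),
                mu[0] + np.sqrt(var[0]) * rng.standard_normal(n))
lhs = np.mean((x - x.mean()) ** 3)

# Right-hand side, evaluated exactly over the two values of Y:
# E(mu_3(X|Y)) + mu_3(E(X|Y)) + 3 cov(E(X|Y), var(X|Y)).
e_mu, e_var = w @ mu, w @ var
rhs = (w @ mu3
       + w @ (mu - e_mu) ** 3
       + 3 * (w @ ((mu - e_mu) * (var - e_var))))

print(lhs, rhs)  # the two values should agree to a couple of decimal places
```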


General 4th-order joint cumulants

For general 4th-order cumulants, the rule gives a sum of 15 terms, as follows:

:\begin{align}
& \kappa(X_1,X_2,X_3,X_4) \\[6pt]
= {} & \kappa(\kappa(X_1,X_2,X_3,X_4\mid Y)) \\[6pt]
& \left.\begin{matrix}
{}+\kappa(\kappa(X_1,X_2,X_3\mid Y),\kappa(X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_2,X_4\mid Y),\kappa(X_3\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_3,X_4\mid Y),\kappa(X_2\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_2,X_3,X_4\mid Y),\kappa(X_1\mid Y))
\end{matrix}\right\}(\text{the } 3+1 \text{ partitions}) \\[6pt]
& \left.\begin{matrix}
{}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3,X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2,X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2,X_3\mid Y))
\end{matrix}\right\}(\text{the } 2+2 \text{ partitions}) \\[6pt]
& \left.\begin{matrix}
{}+\kappa(\kappa(X_1,X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_3\mid Y),\kappa(X_2\mid Y),\kappa(X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_1,X_4\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_2,X_3\mid Y),\kappa(X_1\mid Y),\kappa(X_4\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_2,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_3\mid Y)) \\[6pt]
{}+\kappa(\kappa(X_3,X_4\mid Y),\kappa(X_1\mid Y),\kappa(X_2\mid Y))
\end{matrix}\right\}(\text{the } 2+1+1 \text{ partitions}) \\[6pt]
& {}+\kappa(\kappa(X_1\mid Y),\kappa(X_2\mid Y),\kappa(X_3\mid Y),\kappa(X_4\mid Y)).
\end{align}
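The fifteen terms correspond to the fifteen partitions of a four-element set (the Bell number ''B''_4 = 15): one of type 4, four of type 3+1, three of type 2+2, six of type 2+1+1, and one of type 1+1+1+1. A small self-contained Python sketch that enumerates the partitions and tallies the block-size patterns:

```python
from collections import Counter

def set_partitions(s):
    """Yield every partition of the list s as a list of blocks."""
    if not s:
        yield []
        return
    first, rest = s[0], s[1:]
    for part in set_partitions(rest):
        # insert `first` into each existing block in turn ...
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        # ... or give it a block of its own
        yield [[first]] + part

shapes = Counter(tuple(sorted((len(b) for b in p), reverse=True))
                 for p in set_partitions([1, 2, 3, 4]))
for shape, count in sorted(shapes.items()):
    print(shape, count)
# (1, 1, 1, 1) 1   (2, 1, 1) 6   (2, 2) 3   (3, 1) 4   (4,) 1  -> 15 in total
```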


Cumulants of compound Poisson random variables

Suppose ''Y'' has a Poisson distribution with expected value ''λ'', and ''X'' is the sum of ''Y'' copies of ''W'' that are independent of each other and of ''Y'':

:X=\sum_{y=1}^Y W_y.

All of the cumulants of the Poisson distribution are equal to each other, and so in this case they are all equal to ''λ''. Also recall that if random variables ''W''_1, ..., ''W''_''m'' are independent, then the ''n''th cumulant is additive:

:\kappa_n(W_1+\cdots+W_m)=\kappa_n(W_1)+\cdots+\kappa_n(W_m).

We will find the 4th cumulant of ''X''. We have:

:\begin{align}
\kappa_4(X) = {} & \kappa(X,X,X,X) \\[6pt]
= {} & \kappa_1(\kappa_4(X\mid Y))+4\kappa(\kappa_3(X\mid Y),\kappa_1(X\mid Y))+3\kappa_2(\kappa_2(X\mid Y)) \\
& {}+6\kappa(\kappa_2(X\mid Y),\kappa_1(X\mid Y),\kappa_1(X\mid Y))+\kappa_4(\kappa_1(X\mid Y)) \\[6pt]
= {} & \kappa_1(Y\kappa_4(W))+4\kappa(Y\kappa_3(W),Y\kappa_1(W))+3\kappa_2(Y\kappa_2(W)) \\
& {}+6\kappa(Y\kappa_2(W),Y\kappa_1(W),Y\kappa_1(W))+\kappa_4(Y\kappa_1(W)) \\[6pt]
= {} & \kappa_4(W)\kappa_1(Y)+4\kappa_3(W)\kappa_1(W)\kappa_2(Y)+3\kappa_2(W)^2\kappa_2(Y) \\
& {}+6\kappa_2(W)\kappa_1(W)^2\kappa_3(Y)+\kappa_1(W)^4\kappa_4(Y) \\[6pt]
= {} & \kappa_4(W)\lambda+4\kappa_3(W)\kappa_1(W)\lambda+3\kappa_2(W)^2\lambda+6\kappa_2(W)\kappa_1(W)^2\lambda+\kappa_1(W)^4\lambda \\[6pt]
= {} & \lambda\operatorname{E}(W^4).
\end{align}

We recognize the last sum as the sum, over all partitions of the set \{1, 2, 3, 4\}, of the product over all blocks of the partition of cumulants of ''W'' of order equal to the size of the block. That is precisely the 4th raw moment of ''W'' (see cumulant for a more leisurely discussion of this fact). Hence the cumulants of ''X'' are the moments of ''W'' multiplied by ''λ''.

In this way we see that every moment sequence is also a cumulant sequence (the converse cannot be true, since cumulants of even order ≥ 4 are in some cases negative, and also because the cumulant sequence of the normal distribution is not the moment sequence of any probability distribution).
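As a sanity check of the conclusion κ_4(''X'') = ''λ''E(''W''^4), here is a small Monte Carlo sketch. The choice ''W'' ~ Exp(1), for which E(''W''^4) = 4! = 24, and the sample size are illustrative assumptions; the fourth-cumulant estimator is noisy, so only rough agreement should be expected.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 2.0, 300_000

# Compound Poisson: X = W_1 + ... + W_Y with Y ~ Poisson(lam), W ~ Exp(1).
y = rng.poisson(lam, n)
x = np.fromiter((rng.standard_exponential(k).sum() for k in y),
                dtype=float, count=n)

# Sample 4th cumulant via central moments: kappa_4 = m_4 - 3 m_2^2.
c = x - x.mean()
kappa4_hat = np.mean(c ** 4) - 3 * np.mean(c ** 2) ** 2

print(kappa4_hat, lam * 24.0)  # e.g. ~48 versus the exact value 48
```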


Conditioning on a Bernoulli random variable

Suppose ''Y'' = 1 with probability ''p'' and ''Y'' = 0 with probability ''q'' = 1 − ''p''. Suppose the conditional probability distribution of ''X'' given ''Y'' is ''F'' if ''Y'' = 1 and ''G'' if ''Y'' = 0. Then we have

:\kappa_n(X)=p\kappa_n(F)+q\kappa_n(G)+\sum_{\pi<\widehat{1}}\kappa_{\left|\pi\right|}(Y)\prod_{B\in\pi}\left(\kappa_{\left|B\right|}(F)-\kappa_{\left|B\right|}(G)\right),

where \pi<\widehat{1} means \pi is a partition of the set \{1,\dots,n\} that is finer than the coarsest partition (the sum is over all partitions except that one), \left|\pi\right| is the number of blocks of \pi, and \left|B\right| is the size of the block ''B''. For example, if ''n'' = 3, then we have

:\kappa_3(X)=p\kappa_3(F)+q\kappa_3(G)+3pq(\kappa_2(F)-\kappa_2(G))(\kappa_1(F)-\kappa_1(G))+pq(q-p)(\kappa_1(F)-\kappa_1(G))^3.
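The ''n'' = 3 case can also be verified exactly. In the Python sketch below, ''F'' and ''G'' are taken to be normal distributions, an illustrative choice that makes κ_3(''F'') = κ_3(''G'') = 0, and the mixture's third cumulant is computed directly from its raw moments.

```python
# Exact check of the n = 3 formula with two normal components.
p, q = 0.3, 0.7
mu1, s1 = 2.0, 3.0    # F = N(mu1, s1^2): kappa_1 = mu1, kappa_2 = s1^2
mu0, s0 = -1.0, 1.5   # G = N(mu0, s0^2)

# Raw moments of the mixture, from the normal raw moments.
m1 = p * mu1 + q * mu0
m2 = p * (mu1**2 + s1**2) + q * (mu0**2 + s0**2)
m3 = p * (mu1**3 + 3 * mu1 * s1**2) + q * (mu0**3 + 3 * mu0 * s0**2)

kappa3 = m3 - 3 * m1 * m2 + 2 * m1**3   # third cumulant from raw moments

# Right-hand side of the formula (kappa_3 of each normal component is 0).
formula = (3 * p * q * (s1**2 - s0**2) * (mu1 - mu0)
           + p * q * (q - p) * (mu1 - mu0)**3)
print(kappa3, formula)   # equal up to floating-point rounding
```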


References

* David Brillinger, "The calculation of cumulants via conditioning", ''Annals of the Institute of Statistical Mathematics'', Vol. 21 (1969), pp. 215–218.