Dual total correlation

In information theory, dual total correlation, information rate, excess entropy (Nihat Ay, E. Olbrich, N. Bertschinger (2001). "A unifying framework for complexity measures of finite systems". European Conference on Complex Systems), or binding information is one of several known non-negative generalizations of mutual information. While total correlation is bounded above by the sum of the entropies of the ''n'' elements, the dual total correlation is bounded above by the joint entropy of the ''n'' elements. Although well behaved, dual total correlation has received much less attention than the total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and dual total correlation.


Definition

For a set of ''n'' random variables \{X_1, \ldots, X_n\}, the dual total correlation D(X_1,\ldots, X_n) is given by

: D(X_1,\ldots, X_n) = H\left( X_1, \ldots, X_n \right) - \sum_{i=1}^n H\left( X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n \right) ,

where H(X_1,\ldots, X_n) is the joint entropy of the variable set \{X_1, \ldots, X_n\} and H(X_i \mid \cdots) is the conditional entropy of variable X_i, given the rest.
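
Because H(X_i \mid \text{rest}) = H(X_1, \ldots, X_n) - H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n), the definition can be evaluated directly from a joint probability mass function. The following is a minimal Python sketch, assuming a pmf represented as a dictionary from outcome tuples to probabilities; the helper names (entropy, marginalize, dual_total_correlation) are illustrative rather than drawn from any library.

<syntaxhighlight lang="python">
import math

def entropy(pmf):
    """Shannon entropy (in bits) of a pmf given as {outcome tuple: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginalize(pmf, keep):
    """Marginal pmf over the variable positions listed in `keep`."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def dual_total_correlation(pmf, n):
    """D = H(X_1..X_n) - sum_i H(X_i | rest),
    using H(X_i | rest) = H(X_1..X_n) - H(rest)."""
    joint = entropy(pmf)
    residual = sum(joint - entropy(marginalize(pmf, [j for j in range(n) if j != i]))
                   for i in range(n))
    return joint - residual

# Example: X3 = X1 XOR X2 with X1, X2 independent fair bits.
# Each variable is a function of the other two, so every conditional
# entropy vanishes and D equals the joint entropy of 2 bits.
xor_pmf = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
print(dual_total_correlation(xor_pmf, 3))  # 2.0
</syntaxhighlight>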


Normalized

The dual total correlation normalized between [0,1] is simply the dual total correlation divided by its maximum value H(X_1, \ldots, X_n),

: ND(X_1,\ldots, X_n) = \frac{D(X_1,\ldots, X_n)}{H(X_1,\ldots, X_n)} .
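
Reusing the helpers from the sketch above, the normalized form is a one-line division; the zero-entropy guard is an assumption for the degenerate case, not part of the definition.

<syntaxhighlight lang="python">
def normalized_dual_total_correlation(pmf, n):
    """ND = D / H(X_1..X_n); returns 0 for a degenerate (zero-entropy) pmf."""
    joint = entropy(pmf)
    return dual_total_correlation(pmf, n) / joint if joint > 0 else 0.0
</syntaxhighlight>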


Relationship with Total Correlation

Dual total correlation is non-negative and bounded above by the joint entropy H(X_1, \ldots, X_n):

: 0 \leq D(X_1, \ldots, X_n) \leq H(X_1, \ldots, X_n) .

Secondly, dual total correlation has a close relationship with total correlation, C(X_1, \ldots, X_n), and can be written in terms of differences between the total correlation of the whole and the total correlations of all subsets of size N-1:

: D(\textbf{X}) = (N-1)\,C(\textbf{X}) - \sum_{i=1}^{N} C(\textbf{X}^{-i})

where \textbf{X} = \{X_1, \ldots, X_N\} and \textbf{X}^{-i} = \{X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_N\}. Furthermore, the total correlation and dual total correlation are related by the following bounds:

: \frac{C(X_1, \ldots, X_n)}{n-1} \leq D(X_1, \ldots, X_n) \leq (n-1)\,C(X_1, \ldots, X_n) .

Finally, the difference between the total correlation and the dual total correlation defines a novel measure of higher-order information-sharing: the O-information:

: \Omega(\textbf{X}) = C(\textbf{X}) - D(\textbf{X}) .

The O-information (first introduced as the "enigmatic information" by James and Crutchfield) is a signed measure that quantifies the extent to which the information in a multivariate random variable is dominated by synergistic interactions (in which case \Omega(\textbf{X}) < 0) or redundant interactions (in which case \Omega(\textbf{X}) > 0).
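
As an illustration of these quantities, the sketch below (again reusing the helpers from the Definition section) computes the total correlation and the O-information for two extreme cases: the XOR triple above is synergy-dominated (\Omega = -1 bit), while three copies of a single fair bit are redundancy-dominated (\Omega = +1 bit).

<syntaxhighlight lang="python">
def total_correlation(pmf, n):
    """C = sum_i H(X_i) - H(X_1..X_n)."""
    return sum(entropy(marginalize(pmf, [i])) for i in range(n)) - entropy(pmf)

def o_information(pmf, n):
    """Omega = C - D: negative when synergy dominates, positive when redundancy dominates."""
    return total_correlation(pmf, n) - dual_total_correlation(pmf, n)

xor_pmf = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
copy_pmf = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(o_information(xor_pmf, 3))   # -1.0  (synergistic: C = 1 bit, D = 2 bits)
print(o_information(copy_pmf, 3))  #  1.0  (redundant:   C = 2 bits, D = 1 bit)
</syntaxhighlight>

Both cases also satisfy the bounds above: for the copy distribution, C/(n-1) = 1 \leq D = 1 \leq (n-1)C = 4.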


History

Han (1978) originally defined the dual total correlation as

: \begin{align}
& D(X_1,\ldots, X_n) \\[10pt]
\equiv {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) \right] - (n-1) \; H(X_1, \ldots, X_n) \; .
\end{align}

However, Abdallah and Plumbley (2010) showed its equivalence to the easier-to-understand form of the joint entropy minus the sum of conditional entropies via the following:

: \begin{align}
& D(X_1,\ldots, X_n) \\[10pt]
\equiv {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) \right] - (n-1) \; H(X_1, \ldots, X_n) \\
= {} & \left[ \sum_{i=1}^n H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) \right] + (1-n) \; H(X_1, \ldots, X_n) \\
= {} & H(X_1, \ldots, X_n) + \sum_{i=1}^n \left[ H(X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n) - H(X_1, \ldots, X_n) \right] \\
= {} & H\left( X_1, \ldots, X_n \right) - \sum_{i=1}^n H\left( X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n \right) \; .
\end{align}
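
The equivalence can also be checked numerically. The sketch below, assuming the helpers defined in the Definition section, compares Han's original form against the conditional-entropy form on a random joint distribution over three binary variables.

<syntaxhighlight lang="python">
import itertools
import random

def han_dual_total_correlation(pmf, n):
    """Han's 1978 form: sum of entropies of all (n-1)-subsets minus (n-1) H(joint)."""
    subset_sum = sum(entropy(marginalize(pmf, [j for j in range(n) if j != i]))
                     for i in range(n))
    return subset_sum - (n - 1) * entropy(pmf)

# Random pmf over three binary variables, normalized to sum to 1.
weights = {o: random.random() for o in itertools.product((0, 1), repeat=3)}
total = sum(weights.values())
pmf = {o: w / total for o, w in weights.items()}

# The two forms agree up to floating-point error.
assert abs(han_dual_total_correlation(pmf, 3) - dual_total_correlation(pmf, 3)) < 1e-12
</syntaxhighlight>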


See also

*Interaction information
*Mutual information
*Total correlation


References

* {{cite journal |doi=10.1038/s42003-023-04843-w |title=Multivariate information theory uncovers synergistic subsystems of the human cerebral cortex |year=2023 |last1=Varley |first1=Thomas |last2=Pope |first2=Maria |last3=Faskowitz |first3=Joshua |last4=Sporns |first4=Olaf |journal=Communications Biology |volume=6 |page=451 |doi-access=free |pmid=37095282 |pmc=10125999}}