Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. Transfer entropy from a process ''X'' to another process ''Y'' is the amount of uncertainty reduced in future values of ''Y'' by knowing the past values of ''X'' given past values of ''Y''. More specifically, if X_t and Y_t for t\in \mathbb{N} denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:

: T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right),

where ''H''(''X'') is the Shannon entropy of ''X''. The above definition of transfer entropy has been extended to other entropy measures such as Rényi entropy.

Transfer entropy is conditional mutual information, with the history of the influenced variable Y_{t-1:t-L} in the condition:

: T_{X\rightarrow Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}).
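As a concrete illustration of the definition, the following sketch estimates transfer entropy between two discrete-valued sequences with a simple plug-in (binning) estimator and history length L = 1, using the conditional-mutual-information form. The function name and the lag-one toy signal are illustrative, not taken from any particular library:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} with history length L = 1:
    T = H(Y_t | Y_{t-1}) - H(Y_t | Y_{t-1}, X_{t-1}),
    computed as the conditional mutual information I(Y_t ; X_{t-1} | Y_{t-1})."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))         # (y_t, y_{t-1}, x_{t-1})
    n = len(triples)
    c_tyx = Counter(triples)                            # joint counts
    c_yx = Counter((yp, xp) for _, yp, xp in triples)   # (y_{t-1}, x_{t-1})
    c_ty = Counter((yt, yp) for yt, yp, _ in triples)   # (y_t, y_{t-1})
    c_y = Counter(yp for _, yp, _ in triples)           # y_{t-1}
    te = 0.0
    for (yt, yp, xp), c in c_tyx.items():
        # ratio of p(y_t | y_{t-1}, x_{t-1}) to p(y_t | y_{t-1})
        te += (c / n) * log2((c / c_yx[(yp, xp)]) / (c_ty[(yt, yp)] / c_y[yp]))
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(10000)]  # i.i.d. binary driver
y = [0] + x[:-1]                                  # Y copies X with a one-step lag

print(transfer_entropy(x, y))  # close to 1 bit: X's past determines Y's next value
print(transfer_entropy(y, x))  # close to 0: Y's past adds nothing about X's future
```

Longer histories (larger L) simply extend the tuples used for counting, but the number of histogram cells grows quickly with L and with the alphabet size, which is one reason nearest-neighbour estimators are preferred for continuous-valued data.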
Transfer entropy reduces to Granger causality for vector auto-regressive processes. It is therefore advantageous when the model assumption of Granger causality does not hold, for example in the analysis of non-linear signals; however, it usually requires more samples for accurate estimation. The probabilities in the entropy formula can be estimated using different approaches (binning, nearest neighbors) or, in order to reduce complexity, using a non-uniform embedding. While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables or considering transfer from a collection of sources, although these forms again require more samples.
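For example, the conditional (multivariate) form can be written, under the same history-length convention and following the conditional-mutual-information formulation above, as:

: T_{X\rightarrow Y \mid Z} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}, Z_{t-1:t-L}),

which discounts any information about Y_t that is already available from the past of a third candidate source ''Z''.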
Transfer entropy has been used for estimation of functional connectivity of neurons, social influence in social networks, and statistical causality between armed conflict events.

Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey as

: I(X^n\to Y^n) = \sum_{i=1}^n I(X^i;Y_i\mid Y^{i-1}),

where X^n denotes the vector X_1,X_2,\ldots,X_n and Y^n denotes Y_1,Y_2,\ldots,Y_n. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with or without feedback and gambling with causal side information.
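Writing out the sum for n = 2 makes the causal conditioning explicit:

: I(X^2\to Y^2) = I(X_1;Y_1) + I(X_1,X_2;Y_2\mid Y_1),

so each output Y_i is credited only with information from inputs up to time i, in contrast to the ordinary mutual information I(X^2;Y^2), which is symmetric in the two processes.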


See also

* Conditional mutual information
* Causality
* Causality (physics)
* Structural equation modeling
* Rubin causal model
* Mutual information


References


External links

* , a toolbox, developed in C++ and MATLAB, for computation of transfer entropy between spike trains.
* , a toolbox, developed in Java and usable in MATLAB, GNU Octave and Python, for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.
* {{cite web|title=Multivariate Transfer Entropy (MuTE) toolbox|url=https://github.com/montaltoalessandro/MuTE|publisher=GitHub|date=2019-01-09}}, a toolbox, developed in MATLAB, for computation of transfer entropy with different estimators.

Causality Nonlinear time series analysis Nonparametric statistics Entropy and information