Transfer Entropy
Transfer Entropy
Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. Transfer entropy from a process ''X'' to another process ''Y'' is the amount of uncertainty reduced in future values of ''Y'' by knowing the past values of ''X'' given past values of ''Y''. More specifically, if X_t and Y_t for t \in \mathbb{N} denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:
: T_{X \to Y} = H\left( Y_t \mid Y_{t-1:t-L} \right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L} \right),
where ''H''(''X'') is the Shannon entropy of ''X''. The above definition of transfer entropy has been extended to other types of entropy measures, such as Rényi entropy. Transfer entropy is conditional mutual information, with the history of the influenced variable Y_{t-1:t-L} in the condition:
: T_{X \to Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}).
Transfer entropy reduces to Granger causality for vector auto-regressive ...
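As a rough illustration, here is a minimal plug-in sketch of this definition for discrete time series with history length L = 1, written in Python with NumPy. The function names and the delayed-copy toy data are assumptions made for the example; they do not come from any particular transfer-entropy library.

```python
# Minimal plug-in estimator for transfer entropy T_{X->Y} with history
# length L = 1, for discrete (symbolic) time series.  Names and toy data
# are illustrative assumptions.
from collections import Counter
import numpy as np

def _entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} = H(Y_t|Y_{t-1}) - H(Y_t|Y_{t-1},X_{t-1})."""
    x, y = np.asarray(x), np.asarray(y)
    y_t, y_past, x_past = y[1:], y[:-1], x[:-1]
    # Conditional entropies via joint entropies: H(A|B) = H(A,B) - H(B).
    h_y_given_ypast = _entropy(list(zip(y_t, y_past))) - _entropy(list(y_past))
    h_y_given_both = (_entropy(list(zip(y_t, y_past, x_past)))
                      - _entropy(list(zip(y_past, x_past))))
    return h_y_given_ypast - h_y_given_both

# Toy example: y copies x with a one-step delay, so T_{X->Y} should be close
# to 1 bit while T_{Y->X} stays near zero (up to estimation noise).
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=5000)
y = np.roll(x, 1)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```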


Non-parametric Statistics
Nonparametric statistics is the branch of statistics that is not based solely on parametrized families of probability distributions (common examples of parameters are the mean and variance). Nonparametric statistics is based on either being distribution-free or having a specified distribution but with the distribution's parameters unspecified. Nonparametric statistics includes both descriptive statistics and statistical inference. Nonparametric tests are often used when the assumptions of parametric tests are violated.
Definitions
The term "nonparametric statistics" has been imprecisely defined in two ways, among others.
Applications and purpose
Non-parametric methods are widely used for studying populations that take on a ranked order (such as movie reviews receiving one to four stars). The use of non-parametric methods may be necessary when data have a ranking but no clear numerical interpretation, such as when assessing preferences. In terms of levels of me ...
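As a small illustration of a distribution-free method on ranked data, here is a sketch using SciPy's rank-based Mann–Whitney U test; the star ratings are invented for the example.

```python
# A distribution-free (rank-based) comparison of ordinal ratings using
# SciPy's Mann-Whitney U test.  The ratings are made-up illustrative data.
from scipy.stats import mannwhitneyu

# 1-4 star reviews for two films; only the ranking of the values is used,
# so no normality or equal-variance assumption is required.
film_a = [4, 3, 4, 2, 4, 3, 4, 4]
film_b = [2, 3, 1, 2, 3, 2, 1, 3]

stat, p_value = mannwhitneyu(film_a, film_b, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}")
```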



Association For Computing Machinery
The Association for Computing Machinery (ACM) is a US-based international learned society for computing. It was founded in 1947 and is the world's largest scientific and educational computing society. The ACM is a non-profit professional membership group, claiming nearly 110,000 student and professional members. Its headquarters are in New York City. The ACM is an umbrella organization for academic and scholarly interests in computer science (informatics). Its motto is "Advancing Computing as a Science & Profession".
History
In 1947, a notice was sent to various people: On January 10, 1947, at the Symposium on Large-Scale Digital Calculating Machinery at the Harvard Computation Laboratory, Professor Samuel H. Caldwell of Massachusetts Institute of Technology spoke of the need for an association of those interested in computing machinery, and of the need for communication between them. ... After making some inquiries during May and June, we believe there is ample interest to ...



Google Code
Google Developers (previously Google Code) is Google's site for software development tools, application programming interfaces (APIs), and technical resources. The site contains documentation on using Google developer tools and APIs, including discussion groups and blogs for developers using Google's developer products. There are APIs offered for almost all of Google's popular consumer products, like Google Maps, YouTube, Google Apps, and others. The site also features a variety of developer products and tools built specifically for developers. Google App Engine is a hosting service for web apps. Project Hosting gives users version control for open source code. Google Web Toolkit (GWT) allows developers to create Ajax applications in the Java programming language. The site contains reference information for community-based developer products that Google is involved with, like Android from the Open Handset Alliance and OpenSocial from the OpenSocial Foundation.
Google APIs
Google offers a variety of APIs ...



Mutual Information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable. Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X,Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI). The quantity was defined and analyzed by Claude Shannon in his landmark paper "A Mathemati ...
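A minimal sketch of this idea, assuming the joint distribution is available as a small probability table: mutual information computed (in bits) as the divergence between the joint distribution and the product of its marginals. The 2x2 table is an invented example.

```python
# Mutual information (in bits) from a joint probability table, computed as
# the KL divergence between the joint and the product of its marginals.
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) for a joint PMF given as a 2-D array summing to 1."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask]))

# Correlated binary pair: X equals Y with probability 0.8.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(mutual_information(p))   # ~0.278 bits; 0 would mean independence
```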


Rubin Causal Model
The Rubin causal model (RCM), also known as the Neyman–Rubin causal model, is an approach to the statistical analysis of cause and effect based on the framework of potential outcomes, named after Donald Rubin. The name "Rubin causal model" was first coined by Paul W. Holland. The potential outcomes framework was first proposed by Jerzy Neyman in his 1923 Master's thesis (Neyman, Jerzy. ''Sur les applications de la theorie des probabilites aux experiences agricoles: Essai des principes.'' Master's Thesis (1923). Excerpts reprinted in English, Statistical Science, Vol. 5, pp. 463–472. D. M. Dabrowska and T. P. Speed, Translators.), though he discussed it only in the context of completely randomized experiments. Rubin extended it into a general framework for thinking about causation in both observational and experimental studies.
Introduction
The Rubin causal model is based on the idea of potential outcomes. For example, a person would have a particular income at age 4 ...
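A toy sketch of the potential-outcomes bookkeeping, with invented income numbers: each unit has two potential outcomes, only one of which is ever observed, so individual effects are not directly observable.

```python
# Illustrative potential-outcomes bookkeeping.  The incomes are invented
# numbers, purely to show the structure of the framework.
import numpy as np

# Potential incomes (in $1000s) with and without, say, a training program.
y_treated   = np.array([50, 42, 61, 38])   # Y_i(1)
y_untreated = np.array([45, 40, 55, 39])   # Y_i(0)
assigned    = np.array([1, 0, 1, 0])       # which outcome is actually observed

individual_effects = y_treated - y_untreated            # never fully observable
observed = np.where(assigned == 1, y_treated, y_untreated)

print("true average treatment effect:", individual_effects.mean())
print("observed outcomes only:", observed)
```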



Structural Equation Modeling
Structural equation modeling (SEM) is a label for a diverse set of methods used by scientists in both experimental and observational research across the sciences, business, and other fields. It is used most in the social and behavioral sciences. A definition of SEM is difficult without reference to highly technical language, but a good starting place is the name itself. SEM involves the construction of a ''model'', to represent how various aspects of an observable or theoretical phenomenon are thought to be causally structurally related to one another. The ''structural'' aspect of the model implies theoretical associations between variables that represent the phenomenon under investigation. The postulated causal structuring is often depicted with arrows representing causal connections between variables (as in Figures 1 and 2) but these causal connections can be equivalently represented as equations. The causal structures imply that specific patterns of connections should appe ...
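As a sketch of how arrows in a structural model correspond to equations, here is a hypothetical three-variable linear system simulated in Python; the variables, coefficients, and noise scales are all invented for illustration and are not a definitive SEM workflow.

```python
# A hypothetical linear structural system X -> M -> Y with a direct X -> Y
# path: each arrow in the diagram becomes a coefficient in one equation.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

x = rng.normal(size=n)                                   # exogenous variable
m = 0.8 * x + rng.normal(scale=0.5, size=n)              # M := 0.8 X + error
y = 0.5 * m + 0.3 * x + rng.normal(scale=0.5, size=n)    # Y := 0.5 M + 0.3 X + error

# The implied covariance structure can then be compared against observed data.
print(np.cov(np.vstack([x, m, y])))
```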


Causality (physics)
Causality is the relationship between causes and effects. While causality is also a topic studied from the perspectives of philosophy and physics, it is operationalized so that causes of an event must be in the past light cone of the event and ultimately reducible to fundamental interactions. Similarly, a cause cannot have an effect outside its future light cone.
As a physical concept
In classical physics, an effect cannot occur ''before'' its cause, which is why solutions such as the advanced time solutions of the Liénard–Wiechert potential are discarded as physically meaningless. In both Einstein's theory of special and general relativity, causality means that an effect cannot occur from a cause that is not in the back (past) light cone of that event. Similarly, a cause cannot have an effect outside its front (future) light cone. These restrictions are consistent with the constraint that mass and energy that act as causal influences cannot travel faster than the speed of li ...



Causality
Causality (also referred to as causation, or cause and effect) is influence by which one event, process, state, or object (''a'' ''cause'') contributes to the production of another event, process, state, or object (an ''effect'') where the cause is partly responsible for the effect, and the effect is partly dependent on the cause. In general, a process has many causes, which are also said to be ''causal factors'' for it, and all lie in its past. An effect can in turn be a cause of, or causal factor for, many other effects, which all lie in its future. Some writers have held that causality is metaphysically prior to notions of time and space. Causality is an abstraction that indicates how the world progresses. As such a basic concept, it is more apt as an explanation of other concepts of progression than as something to be explained by others more basic. The concept is like those of agency and efficacy. For this reason, a leap of intuition may be needed to grasp it. Accordin ...




Conditional Mutual Information
In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.
Definition
For random variables X, Y, and Z with support sets \mathcal{X}, \mathcal{Y} and \mathcal{Z}, we define the conditional mutual information as
: I(X;Y \mid Z) = \int_{\mathcal{Z}} D_{\mathrm{KL}}\!\left( P_{(X,Y)\mid Z} \,\|\, P_{X\mid Z} \otimes P_{Y\mid Z} \right) dP_Z.
This may be written in terms of the expectation operator:
: I(X;Y \mid Z) = \mathbb{E}_Z \left[ D_{\mathrm{KL}}\!\left( P_{(X,Y)\mid Z} \,\|\, P_{X\mid Z} \otimes P_{Y\mid Z} \right) \right].
Thus I(X;Y \mid Z) is the expected (with respect to Z) Kullback–Leibler divergence from the conditional joint distribution P_{(X,Y)\mid Z} to the product of the conditional marginals P_{X\mid Z} and P_{Y\mid Z}. Compare with the definition of mutual information.
In terms of PMFs for discrete distributions
For discrete random variables X, Y, and Z with support sets \mathcal{X}, \mathcal{Y} and \mathcal{Z}, the conditional mutual information I(X;Y \mid Z) is as follows:
: I(X;Y \mid Z) = \sum_{z \in \mathcal{Z}} p_Z(z) \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} p_{X,Y \mid Z}(x,y \mid z) \log \frac{p_{X,Y \mid Z}(x,y \mid z)}{p_{X \mid Z}(x \mid z)\, p_{Y \mid Z}(y \mid z)}
where the marginal, joint, ...
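A minimal sketch of the discrete formula above, assuming the joint PMF p(x, y, z) is given as a 3-D array; the example table is invented.

```python
# Conditional mutual information I(X;Y|Z) in bits from a discrete joint PMF
# p(x, y, z) stored as a 3-D array.
import numpy as np

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) = sum_z p(z) sum_{x,y} p(x,y|z) log[p(x,y|z) / (p(x|z) p(y|z))]."""
    p_z = p_xyz.sum(axis=(0, 1))                      # p(z)
    cmi = 0.0
    for z in range(p_xyz.shape[2]):
        if p_z[z] == 0:
            continue
        p_xy_given_z = p_xyz[:, :, z] / p_z[z]        # p(x,y|z)
        p_x_given_z = p_xy_given_z.sum(axis=1, keepdims=True)
        p_y_given_z = p_xy_given_z.sum(axis=0, keepdims=True)
        mask = p_xy_given_z > 0
        cmi += p_z[z] * np.sum(
            p_xy_given_z[mask]
            * np.log2(p_xy_given_z[mask] / (p_x_given_z * p_y_given_z)[mask])
        )
    return cmi

# X and Y are copies of each other when Z = 0 and independent when Z = 1.
p = np.zeros((2, 2, 2))
p[:, :, 0] = np.array([[0.25, 0.0], [0.0, 0.25]])        # perfectly dependent
p[:, :, 1] = np.array([[0.125, 0.125], [0.125, 0.125]])  # independent
print(conditional_mutual_information(p))  # 0.5 * 1 bit + 0.5 * 0 = 0.5 bits
```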



Gambling
Gambling (also known as betting or gaming) is the wagering of something of value ("the stakes") on a random event with the intent of winning something else of value, where instances of strategy are discounted. Gambling thus requires three elements to be present: consideration (an amount wagered), risk (chance), and a prize. The outcome of the wager is often immediate, such as a single roll of dice, a spin of a roulette wheel, or a horse crossing the finish line, but longer time frames are also common, allowing wagers on the outcome of a future sports contest or even an entire sports season. The term "gaming" in this context typically refers to instances in which the activity has been specifically permitted by law. The two words are not mutually exclusive; ''i.e.'', a "gaming" company offers (legal) "gambling" activities to the public and may be regulated by one of many gaming control boards, for example, the Nevada Gaming Control Board. However, this distinction is not u ...


Directed Information
Directed information, I(X^n \to Y^n), is an information theory measure that quantifies the information flow from the random process X^n = \{X_1, X_2, \dots, X_n\} to the random process Y^n = \{Y_1, Y_2, \dots, Y_n\}. The term ''directed information'' was coined by James Massey and is defined as
: I(X^n \to Y^n) \triangleq \sum_{i=1}^n I(X^i; Y_i \mid Y^{i-1}),
where I(X^i; Y_i \mid Y^{i-1}) is the conditional mutual information I(X_1, X_2, \dots, X_i; Y_i \mid Y_1, Y_2, \dots, Y_{i-1}). Directed information has many applications in problems where causality plays an important role, such as capacity of channels with feedback, capacity of discrete memoryless networks with feedback, gambling with causal side information, compression with causal side information, real-time control communication settings, and statistical physics.
Marko's theory of bidirectional communication
Massey's directed information was motivated by Marko's work on developing a theory of bidirectional communication.
Estimation and optimization ...
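A rough plug-in sketch of this sum, estimated from many independent trials of short binary sequences; the memoryless bit-flip channel and the naive empirical estimator are assumptions made for the example.

```python
# Plug-in estimate of directed information I(X^n -> Y^n) = sum_i I(X^i ; Y_i | Y^{i-1})
# from repeated trials of short binary sequences.
from collections import Counter
import numpy as np

def _entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def _cond_mi(a, b, c):
    """Plug-in I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C) over trial tuples."""
    return (_entropy(list(zip(a, c))) + _entropy(list(zip(b, c)))
            - _entropy(list(zip(a, b, c))) - _entropy(list(c)))

def directed_information(x, y):
    """x, y: arrays of shape (trials, n); returns the plug-in estimate in bits."""
    trials, n = x.shape
    total = 0.0
    for i in range(n):
        x_prefix = [tuple(row) for row in x[:, :i + 1]]   # X^i
        y_now = list(y[:, i])                             # Y_i
        y_past = [tuple(row) for row in y[:, :i]]         # Y^{i-1}
        total += _cond_mi(x_prefix, y_now, y_past)
    return total

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=(20_000, 3))
flips = rng.random((20_000, 3)) < 0.1
y = (x + flips) % 2                # Y_i is X_i passed through a 10% bit-flip
print(directed_information(x, y))  # roughly 3 * (1 - H_b(0.1)) ~ 1.6 bits
```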