Causal Markov condition
The Markov condition, sometimes called the Markov assumption, is an assumption made in Bayesian probability theory that every node in a Bayesian network is conditionally independent of its nondescendants, given its parents. Stated loosely, it is assumed that a node has no bearing on nodes which do not descend from it. In a DAG, this local Markov condition is equivalent to the global Markov condition, which states that d-separations in the graph correspond to conditional independence relations. This also means that a node is conditionally independent of the entire network, given its Markov blanket.

The related Causal Markov (CM) condition states that, conditional on the set of all its direct causes, a node is independent of all variables which are not effects or direct causes of that node. In the event that the structure of a Bayesian network accurately depicts causality, the two conditions are equivalent. However, a network may accurately embody the Markov condition without depicting causality, in which case it should not be assumed to embody the causal Markov condition.
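To make the Markov blanket concrete, here is a minimal sketch in Python (the networkx dependency and the example graph are illustrative assumptions, not part of the article): the blanket of a node consists of its parents, its children, and its children's other parents.

    import networkx as nx

    def markov_blanket(g: nx.DiGraph, node):
        """Parents, children, and the children's other parents of `node`."""
        parents = set(g.predecessors(node))
        children = set(g.successors(node))
        spouses = {p for c in children for p in g.predecessors(c)} - {node}
        return parents | children | spouses

    # Hypothetical DAG: D -> A -> X -> C, B -> C
    g = nx.DiGraph([("D", "A"), ("A", "X"), ("X", "C"), ("B", "C")])
    print(markov_blanket(g, "X"))  # {'A', 'B', 'C'} (order may vary); D is outside the blanket

Given the blanket {A, B, C}, the remaining node D carries no further information about X, which is the sense in which the blanket screens X from the rest of the network.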


Definition

Let G be an acyclic causal graph (a graph in which each node appears only once along any path) with vertex set V and let P be a probability distribution over the vertices in V generated by G. G and P satisfy the Causal Markov Condition if every node ''X'' in V is independent of NonDescendants(''X'') given Parents(''X'').
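The sets in this definition are purely graphical, so they are cheap to compute. A minimal sketch (Python with networkx; library and example graph are assumptions made for illustration) of extracting Parents(''X'') and NonDescendants(''X'') from a DAG:

    import networkx as nx

    def parents(g: nx.DiGraph, node):
        return set(g.predecessors(node))

    def non_descendants(g: nx.DiGraph, node):
        # Everything except the node itself and its descendants.
        return set(g.nodes) - nx.descendants(g, node) - {node}

    # Hypothetical causal graph: Z -> X -> Y, Z -> W
    g = nx.DiGraph([("Z", "X"), ("X", "Y"), ("Z", "W")])
    assert parents(g, "X") == {"Z"}
    assert non_descendants(g, "X") == {"Z", "W"}
    # The condition asserts X is independent of {Z, W} given {Z},
    # i.e. X is independent of W once its parent Z is known.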


Motivation

Statisticians are enormously interested in the ways in which certain events and variables are connected. A precise notion of what constitutes a cause and its effect is necessary to understand the connections between them. The central idea behind the philosophical study of causation is that causes raise the probabilities of their effects, all else being equal. A deterministic interpretation of causation means that if ''A'' causes ''B'', then ''A'' must ''always'' be followed by ''B''. In this sense, smoking does not cause cancer because some smokers never develop cancer. On the other hand, a probabilistic interpretation simply means that causes raise the probability of their effects. In this sense, changes in meteorological readings associated with a storm do cause that storm, since they raise its probability. (Simply looking at a barometer, however, does not change the probability of the storm.)


Implications


Dependence and Causation

It follows from the definition that if ''X'' and ''Y'' are in V and are probabilistically dependent, then either ''X'' causes ''Y'', ''Y'' causes ''X'', or ''X'' and ''Y'' are both effects of some common cause ''Z'' in V. This principle was first articulated by Hans Reichenbach as the Common Cause Principle (CCP).
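The common-cause case can be illustrated with a synthetic simulation (the variables and coefficients below are assumptions made for this sketch, not taken from the article): when ''Z'' drives both ''X'' and ''Y'', the two are marginally correlated, but the correlation vanishes once ''Z'' is held fixed.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Common cause Z drives both X and Y; X and Y do not affect each other.
    z = rng.normal(size=n)
    x = z + rng.normal(size=n)
    y = z + rng.normal(size=n)

    print(np.corrcoef(x, y)[0, 1])              # roughly 0.5: marginal dependence
    mask = np.abs(z) < 0.1                      # crude conditioning: hold Z nearly fixed
    print(np.corrcoef(x[mask], y[mask])[0, 1])  # roughly 0: screened off by Z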


Screening

It once again follows from the definition that the parents of ''X'' screen ''X'' from other "indirect causes" of ''X'' (parents of Parents(''X'')) and other effects of Parents(''X'') which are not also effects of ''X''.
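A minimal numeric sketch of screening (a synthetic chain ''Z'' → ''W'' → ''X''; the probabilities are illustrative assumptions): conditioning on the parent ''W'' renders ''X'' independent of the indirect cause ''Z''.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    # Chain of binary variables: Z -> W -> X.
    z = rng.random(n) < 0.5
    w = np.where(z, rng.random(n) < 0.8, rng.random(n) < 0.2)
    x = np.where(w, rng.random(n) < 0.9, rng.random(n) < 0.1)

    # Marginally, X depends on Z...
    print(x[z].mean(), x[~z].mean())          # ~0.74 vs ~0.26
    # ...but given the parent W, Z adds nothing: P(X | W, Z) = P(X | W).
    print(x[w & z].mean(), x[w & ~z].mean())  # both ~0.9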


Examples

In a simple view, releasing one's hand from a hammer causes the hammer to fall. However, doing so in outer space does not produce the same outcome, calling into question whether releasing one's fingers from a hammer ''always'' causes it to fall. A causal graph could be created to acknowledge that both the presence of gravity and the release of the hammer contribute to its falling. However, it would be very surprising if the surface underneath the hammer affected its falling. This essentially states the Causal Markov Condition: given the existence of gravity and the release of the hammer, it will fall regardless of what is beneath it.
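This graph can be sketched in code (the encoding and node names are illustrative assumptions; networkx is an assumed dependency): Gravity and Release are the parents of Falls, while Surface is causally disconnected, so Falls is independent of Surface given its parents.

    import networkx as nx

    # Hammer example as a causal DAG.
    g = nx.DiGraph()
    g.add_edges_from([("Gravity", "Falls"), ("Release", "Falls")])
    g.add_node("Surface")  # present in the scene, causally disconnected

    parents = set(g.predecessors("Falls"))
    non_desc = set(g.nodes) - nx.descendants(g, "Falls") - {"Falls"}
    print(parents)             # {'Gravity', 'Release'}
    print(non_desc - parents)  # {'Surface'}: independent of Falls given the parents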


See also

* Causal model

