Markov Logic Networks

A Markov logic network (MLN) is a probabilistic logic which applies the ideas of a Markov network to first-order logic, enabling uncertain inference. Markov logic networks generalize first-order logic, in the sense that, in a certain limit, all unsatisfiable statements have a probability of zero, and all tautologies have probability one.


History

Work in this area began in 2003, when Pedro Domingos and Matt Richardson introduced the term "Markov logic network" (MLN) to describe it.


Description

Briefly, an MLN is a collection of formulas from first-order logic, to each of which is assigned a real number, the weight. Taken as a Markov network, the vertices of the network graph are atomic formulas, and the edges are the logical connectives used to construct the formula. Each formula is considered to be a clique, and the Markov blanket of an atom is the set of formulas in which it appears. A potential function is associated to each formula; it takes the value one when the formula is true and zero when it is false. The potential functions are combined with the weights to form the Gibbs measure and partition function for the Markov network.

The above definition glosses over a subtle point: atomic formulas do not have a truth value unless they are grounded and given an interpretation; that is, until they are ground atoms with a Herbrand interpretation. Thus, a Markov logic network becomes a Markov network only with respect to a specific grounding and interpretation; the resulting Markov network is called the ground Markov network. The vertices of the graph of the ground Markov network are the ground atoms. The size of the resulting Markov network thus depends strongly (exponentially) on the number of constants in the domain of discourse.
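In symbols, the joint distribution defined by the ground Markov network takes the standard log-linear form (this is the conventional presentation of Markov logic networks, not a quotation from the text above):

    P(X = x) = \frac{1}{Z} \exp\Big( \sum_i w_i \, n_i(x) \Big)

where n_i(x) is the number of true groundings of the i-th formula under the possible world x, w_i is that formula's weight, and Z is the partition function, the sum of \exp\big(\sum_i w_i \, n_i(x)\big) over all possible worlds.

The following is a minimal Python sketch of these definitions for a hand-grounded toy network; the atoms, weights, and helper names are invented for illustration (the classic "friends and smokers" example) and are not taken from any particular implementation.

import itertools
import math

# Toy hand-grounded MLN over two constants A and B.
ATOMS = ["Smokes(A)", "Smokes(B)", "Friends(A,B)"]

# Each ground formula is a (weight, truth function over a world) pair.
GROUND_FORMULAS = [
    # Friends(A,B) ∧ Smokes(A) → Smokes(B), weight 1.5
    (1.5, lambda w: not (w["Friends(A,B)"] and w["Smokes(A)"]) or w["Smokes(B)"]),
    # ... and the symmetric direction
    (1.5, lambda w: not (w["Friends(A,B)"] and w["Smokes(B)"]) or w["Smokes(A)"]),
    # Weak priors against smoking (negative weight when the atom is true)
    (-0.5, lambda w: w["Smokes(A)"]),
    (-0.5, lambda w: w["Smokes(B)"]),
]

def log_weight(world):
    # Unnormalized log-probability: the sum of weights of satisfied formulas.
    return sum(w for w, f in GROUND_FORMULAS if f(world))

def partition_function():
    # Exact Z by enumerating all 2^n worlds; feasible only for tiny networks,
    # which is precisely the exponential blow-up in the number of constants
    # noted above.
    return sum(
        math.exp(log_weight(dict(zip(ATOMS, values))))
        for values in itertools.product([False, True], repeat=len(ATOMS))
    )

def probability(world):
    # P(X = x) = exp(sum_i w_i * n_i(x)) / Z
    return math.exp(log_weight(world)) / partition_function()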


Inference

The goal of inference in a Markov logic network is to find the stationary distribution of the system, or one close to it; that this may be difficult or not always possible is illustrated by the richness of behaviour seen in the Ising model. As in a Markov network, the stationary distribution finds the most likely assignment of probabilities to the vertices of the graph; in this case, the vertices are the ground atoms of an interpretation. That is, the distribution indicates the probability of the truth or falsehood of each ground atom. Given the stationary distribution, one can then perform inference in the traditional statistical sense of conditional probability: obtain the probability P(A|B) that formula A holds, given that formula B is true. Inference in MLNs can be performed using standard Markov network inference techniques over the minimal subset of the relevant Markov network required to answer the query. These techniques include Gibbs sampling, which is effective but may be excessively slow for large networks; belief propagation; and approximation via pseudolikelihood.
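To make the sampling approach concrete, the following sketch (continuing the toy grounding above, so ATOMS, GROUND_FORMULAS and log_weight are as defined there; again, every name is hypothetical) performs Gibbs sampling: each non-evidence ground atom is repeatedly resampled from its conditional distribution given the current values of all other atoms, and marginals are estimated from the visited worlds.

import math
import random

def gibbs_marginals(evidence, n_sweeps=10000, seed=0):
    # Estimate P(atom = true | evidence) for each non-evidence atom.
    rng = random.Random(seed)
    world = {a: evidence.get(a, rng.random() < 0.5) for a in ATOMS}
    free = [a for a in ATOMS if a not in evidence]
    counts = dict.fromkeys(free, 0)
    for _ in range(n_sweeps):
        for a in free:
            # The conditional of atom a given the rest depends only on the
            # two unnormalized log-weights with a set true versus false.
            world[a] = True
            lw_true = log_weight(world)
            world[a] = False
            lw_false = log_weight(world)
            p_true = 1.0 / (1.0 + math.exp(lw_false - lw_true))
            world[a] = rng.random() < p_true
        for a in free:
            counts[a] += world[a]  # booleans count as 0/1
    return {a: counts[a] / n_sweeps for a in free}

# e.g. the marginal of Smokes(B) given Smokes(A) and Friends(A,B) are true
print(gibbs_marginals({"Smokes(A)": True, "Friends(A,B)": True}))

A production sampler would discard an initial burn-in period and possibly thin the chain; this sketch omits both for brevity.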


See also

* Markov random field
* Statistical relational learning
* Probabilistic logic network
* Probabilistic soft logic




External links


University of Washington Statistical Relational Learning group

Alchemy 2.0: Markov logic networks in C++

pracmln: Markov logic networks in Python

ProbCog: Markov logic networks in Python and Java that can use its own inference engine or Alchemy's

markov thebeast: Markov logic networks in Java

RockIt: Markov logic networks in Java (with web interface/REST API)

Tuffy: A Learning and Inference Engine with strong RDBMS-based optimization for scalability

Felix: A successor to Tuffy, with prebuilt submodules to speed up common subtasks


Figaro: Scala-based MLN language (https://www.cra.com/commercial-solutions/probabilistic-modeling-services.asp)
LoMRF: Logical Markov Random Fields, an open-source implementation of Markov Logic Networks in Scala