In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.


Definition

The joint Shannon entropy (in bits) of two discrete random variables X and Y with images \mathcal X and \mathcal Y is defined as

:\Eta(X,Y) = -\sum_{x\in\mathcal X} \sum_{y\in\mathcal Y} P(x,y) \log_2[P(x,y)]

where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2[P(x,y)] is defined to be 0 if P(x,y)=0.

For more than two random variables X_1, \ldots, X_n this expands to

:\Eta(X_1, \ldots, X_n) = -\sum_{x_1\in\mathcal X_1} \cdots \sum_{x_n\in\mathcal X_n} P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]

where x_1,\ldots,x_n are particular values of X_1,\ldots,X_n, respectively, P(x_1, \ldots, x_n) is the probability of these values occurring together, and P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)] is defined to be 0 if P(x_1, \ldots, x_n)=0.
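For illustration, a minimal Python sketch (using NumPy; the helper name joint_entropy and the example distribution are our own choices, not part of the definition) that evaluates this sum for a joint probability mass function supplied as an array:

    import numpy as np

    def joint_entropy(joint_pmf):
        # Joint Shannon entropy (in bits) of a discrete joint distribution.
        # joint_pmf: array of probabilities P(x, y), summing to 1.
        # Zero-probability terms contribute 0, matching the convention above.
        p = np.asarray(joint_pmf, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Example: two independent fair coin flips -> H(X,Y) = 2 bits
    pmf = np.array([[0.25, 0.25],
                    [0.25, 0.25]])
    print(joint_entropy(pmf))  # 2.0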


Properties


Nonnegativity

The joint entropy of a set of random variables is a nonnegative number.

:\Eta(X,Y) \geq 0

:\Eta(X_1,\ldots, X_n) \geq 0


Greater than individual entropies

The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set.

:\Eta(X,Y) \geq \max\left[\Eta(X),\Eta(Y)\right]

:\Eta\bigl(X_1,\ldots, X_n\bigr) \geq \max_{1 \le i \le n} \Bigl\{ \Eta\bigl(X_i\bigr) \Bigr\}


Less than or equal to the sum of individual entropies

The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity. This inequality is an equality if and only if X and Y are statistically independent.

:\Eta(X,Y) \leq \Eta(X) + \Eta(Y)

:\Eta(X_1,\ldots, X_n) \leq \Eta(X_1) + \ldots + \Eta(X_n)
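Both bounds are easy to check numerically. A small Python sketch (the entropy helper and the example joint distribution are illustrative choices, not part of the article):

    import numpy as np

    def entropy(p):
        # Shannon entropy (in bits) of a probability array; zeros contribute 0.
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # A correlated joint distribution for (X, Y)
    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)  # marginal distributions

    H_xy, H_x, H_y = entropy(pxy), entropy(px), entropy(py)
    print(max(H_x, H_y) <= H_xy <= H_x + H_y)  # True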


Relations to other entropy measures

Joint entropy is used in the definition of conditional entropy:

:\Eta(X|Y) = \Eta(X,Y) - \Eta(Y)\,,

and

:\Eta(X_1,\dots,X_n) = \sum_{k=1}^n \Eta(X_k | X_{k-1},\dots, X_1).

For two variables X and Y, this means that

:\Eta(X,Y) = \Eta(Y) + \Eta(X|Y) = \Eta(X) + \Eta(Y|X).

Joint entropy is also used in the definition of mutual information:

:\operatorname{I}(X;Y) = \Eta(X) + \Eta(Y) - \Eta(X,Y)\,.

In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
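As a sketch of how these identities are used in practice, the following Python snippet (reusing the same illustrative entropy helper and example distribution as above, both our own choices) computes H(X|Y) and I(X;Y) from the joint entropy:

    import numpy as np

    def entropy(p):
        # Shannon entropy (in bits); zero-probability terms contribute 0.
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    H_xy = entropy(pxy)
    H_x_given_y = H_xy - entropy(py)                 # H(X|Y) = H(X,Y) - H(Y)
    mutual_info = entropy(px) + entropy(py) - H_xy   # I(X;Y)
    print(H_x_given_y, mutual_info)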


Joint differential entropy


Definition

The above definition applies to discrete random variables; the analogous quantity for continuous random variables is called ''joint differential (or continuous) entropy''. Let X and Y be continuous random variables with a joint probability density function f(x,y). The differential joint entropy h(X,Y) is defined as

:h(X,Y) = -\int_{\mathcal X ,\mathcal Y} f(x,y)\log f(x,y)\,dx\,dy

For more than two continuous random variables X_1, \ldots, X_n the definition is generalized to:

:h(X_1, \ldots,X_n) = -\int f(x_1, \ldots,x_n)\log f(x_1, \ldots,x_n)\,dx_1 \ldots dx_n

The integral is taken over the support of f. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
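For a concrete example, if (X,Y) is bivariate normal with covariance matrix \Sigma, the standard closed form (stated here for illustration, with the natural logarithm, so the result is in nats) is

:h(X,Y) = \frac{1}{2}\log\left[(2\pi e)^2 \det\Sigma\right]

which equals h(X) + h(Y) exactly when \Sigma is diagonal, i.e. when X and Y are independent.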


Properties

As in the discrete case, the joint differential entropy of a set of random variables is less than or equal to the sum of the entropies of the individual random variables:

:h(X_1,X_2, \ldots,X_n) \le \sum_{i=1}^n h(X_i)

The following chain rule holds for two random variables:

:h(X,Y) = h(X|Y) + h(Y)

In the case of more than two random variables this generalizes to:

:h(X_1,X_2, \ldots,X_n) = \sum_{i=1}^n h(X_i | X_1,X_2, \ldots,X_{i-1})

Joint differential entropy is also used in the definition of the mutual information between continuous random variables:

:\operatorname{I}(X,Y)=h(X)+h(Y)-h(X,Y)
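For example, the mutual information of a bivariate normal can be recovered from its joint and marginal differential entropies; a short Python sketch (using the standard Gaussian closed forms in nats; the correlation value rho = 0.8 and the variable names are arbitrary illustrative choices):

    import numpy as np

    rho = 0.8
    sigma = np.array([[1.0, rho],
                      [rho, 1.0]])   # covariance matrix with unit variances

    # Closed-form differential entropies in nats: h = 0.5*log((2*pi*e)^k * det(Sigma))
    h_x = 0.5 * np.log(2 * np.pi * np.e * sigma[0, 0])
    h_y = 0.5 * np.log(2 * np.pi * np.e * sigma[1, 1])
    h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(sigma))

    mi = h_x + h_y - h_xy
    print(mi, -0.5 * np.log(1 - rho**2))  # both approximately 0.5108 nats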

