In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.


Definition

The joint Shannon entropy (in bits) of two discrete random variables X and Y with images \mathcal X and \mathcal Y is defined as

:\Eta(X,Y) = -\sum_{x\in\mathcal X} \sum_{y\in\mathcal Y} P(x,y) \log_2\left[P(x,y)\right]

where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2\left[P(x,y)\right] is defined to be 0 if P(x,y)=0.

For more than two random variables X_1, \ldots, X_n this expands to

:\Eta(X_1, \ldots, X_n) = -\sum_{x_1 \in\mathcal X_1} \cdots \sum_{x_n \in\mathcal X_n} P(x_1, \ldots, x_n) \log_2\left[P(x_1, \ldots, x_n)\right]

where x_1,\ldots,x_n are particular values of X_1,\ldots,X_n, respectively, P(x_1, \ldots, x_n) is the probability of these values occurring together, and P(x_1, \ldots, x_n) \log_2\left[P(x_1, \ldots, x_n)\right] is defined to be 0 if P(x_1, \ldots, x_n)=0.


Properties


Nonnegativity

The joint entropy of a set of random variables is a nonnegative number.

:\Eta(X,Y) \geq 0

:\Eta(X_1,\ldots, X_n) \geq 0


Greater than individual entropies

The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set.

:\Eta(X,Y) \geq \max \left[\Eta(X),\Eta(Y)\right]

:\Eta \bigl(X_1,\ldots, X_n \bigr) \geq \max_{1 \le i \le n} \Bigl\{ \Eta\bigl(X_i\bigr) \Bigr\}


Less than or equal to the sum of individual entropies

The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set. This is an example of subadditivity. This inequality is an equality if and only if X and Y are statistically independent.

:\Eta(X,Y) \leq \Eta(X) + \Eta(Y)

:\Eta(X_1,\ldots, X_n) \leq \Eta(X_1) + \ldots + \Eta(X_n)
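Both bounds are easy to check numerically for a small joint table. The sketch below (the table values and helper name are illustrative choices of ours) computes the marginal entropies from the joint table and verifies the maximum lower bound from the previous property as well as the subadditivity upper bound:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of an array of probabilities."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

# A joint distribution where X and Y are correlated but not identical.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

h_xy = entropy(p_xy)               # joint entropy H(X, Y), ~1.722 bits
h_x = entropy(p_xy.sum(axis=1))    # marginal entropy H(X) = 1 bit
h_y = entropy(p_xy.sum(axis=0))    # marginal entropy H(Y) = 1 bit

assert h_xy >= max(h_x, h_y)       # at least the largest individual entropy
assert h_xy <= h_x + h_y           # subadditivity; equality iff independent
```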


Relations to other entropy measures

Joint entropy is used in the definition of conditional entropy:

:\Eta(X|Y) = \Eta(X,Y) - \Eta(Y)\,,

and

:\Eta(X_1,\dots,X_n) = \sum_{k=1}^n \Eta(X_k | X_{k-1},\dots, X_1)\,.

It is also used in the definition of mutual information:

:\operatorname{I}(X;Y) = \Eta(X) + \Eta(Y) - \Eta(X,Y)\,.

In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
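The same kind of joint table used above can illustrate these identities. The self-contained sketch below (illustrative numbers of our own choosing) recovers the conditional entropy and the mutual information from joint and marginal entropies:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
h_xy = entropy(p_xy)
h_x = entropy(p_xy.sum(axis=1))
h_y = entropy(p_xy.sum(axis=0))

h_x_given_y = h_xy - h_y           # H(X|Y) = H(X,Y) - H(Y),       ~0.722 bits
mutual_info = h_x + h_y - h_xy     # I(X;Y) = H(X) + H(Y) - H(X,Y), ~0.278 bits
print(h_x_given_y, mutual_info)
```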


Joint differential entropy


Definition

The above definition is for discrete random variables and is just as valid in the case of continuous random variables. The continuous version of discrete joint entropy is called ''joint differential (or continuous) entropy''. Let X and Y be continuous random variables with a joint probability density function f(x,y). The differential joint entropy h(X,Y) is defined as

:h(X,Y) = -\int_{\mathcal X , \mathcal Y} f(x,y)\log f(x,y)\,dx\,dy

For more than two continuous random variables X_1, \ldots, X_n the definition is generalized to:

:h(X_1, \ldots,X_n) = -\int f(x_1, \ldots,x_n)\log f(x_1, \ldots,x_n)\,dx_1 \ldots dx_n

The integral is taken over the support of f. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
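For a concrete case, a bivariate Gaussian has a closed-form joint differential entropy, h(X,Y) = \tfrac{1}{2}\ln\bigl[(2\pi e)^2 \det\Sigma\bigr] (in nats, using the natural logarithm), which can be compared against a Monte Carlo estimate of -\operatorname{E}[\log f(X,Y)]. In the Python sketch below the correlation value and sample size are illustrative choices of ours:

```python
import numpy as np

rng = np.random.default_rng(0)

rho = 0.6                                   # illustrative correlation
cov = np.array([[1.0, rho],
                [rho, 1.0]])

# Closed form (nats): h(X, Y) = 0.5 * ln((2*pi*e)^2 * det(cov))
h_closed = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))

# Monte Carlo: h(X, Y) = -E[ln f(X, Y)], sampling from the density itself.
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
quad = np.einsum('ij,jk,ik->i', samples, np.linalg.inv(cov), samples)
log_f = -0.5 * quad - np.log(2 * np.pi * np.sqrt(np.linalg.det(cov)))
h_mc = -log_f.mean()

print(h_closed, h_mc)   # the estimate should match the closed form closely
```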


Properties

As in the discrete case, the joint differential entropy of a set of random variables is less than or equal to the sum of the entropies of the individual random variables:

:h(X_1,X_2, \ldots,X_n) \le \sum_{i=1}^n h(X_i)

The following chain rule holds for two random variables:

:h(X,Y) = h(X|Y) + h(Y)

In the case of more than two random variables this generalizes to:

:h(X_1,X_2, \ldots,X_n) = \sum_{i=1}^n h(X_i | X_1,X_2, \ldots,X_{i-1})

Joint differential entropy is also used in the definition of the mutual information between continuous random variables:

:\operatorname{I}(X,Y)=h(X)+h(Y)-h(X,Y)
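As a worked instance of the last identity, for a bivariate Gaussian with unit variances and correlation \rho the individual and joint differential entropies are known in closed form, and the identity reduces to \operatorname{I}(X,Y) = -\tfrac{1}{2}\ln(1-\rho^2) (in nats). The value of \rho below is an illustrative choice of ours:

```python
import numpy as np

rho = 0.6                                           # illustrative correlation
h_x = 0.5 * np.log(2 * np.pi * np.e)                # h(X) for unit variance
h_y = h_x                                           # h(Y), by symmetry
h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * (1 - rho ** 2))  # joint entropy

mutual_info = h_x + h_y - h_xy                      # I(X,Y) via the identity
print(mutual_info, -0.5 * np.log(1 - rho ** 2))     # both ~0.2231 nats
```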

