In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
Definition
The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as

$$H(X,Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x,y) \log_2 [P(x,y)]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y) \log_2 [P(x,y)]$ is defined to be 0 if $P(x,y) = 0$.
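The definition can be sketched in Python; the `joint_entropy` function and the dictionary representation of the joint distribution are illustrative assumptions, not part of any standard library.

```python
import math

def joint_entropy(joint_p):
    """Joint Shannon entropy H(X, Y) in bits.

    `joint_p` maps outcome pairs (x, y) to probabilities P(x, y).
    Terms with P(x, y) == 0 are skipped, matching the convention
    that 0 * log2(0) is taken to be 0.
    """
    return -sum(p * math.log2(p) for p in joint_p.values() if p > 0)

# Two fair coins flipped independently: four equally likely outcomes,
# so H(X, Y) = log2(4) = 2 bits.
p = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(p))  # 2.0
```

A degenerate distribution concentrated on a single pair has zero joint entropy, since there is no uncertainty about the outcome.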