In information theory, the cross-entropy between two probability distributions $p$ and $q$, over the same underlying set of events, measures the average number of bits needed to identify an event drawn from the set when the coding scheme used for the set is optimized for an estimated probability distribution $q$, rather than the true distribution $p$.
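As a minimal sketch of this coding interpretation (the two distributions below are hypothetical examples, and base-2 logarithms are assumed so that lengths come out in bits): an ideal code built for $q$ assigns an event of probability $q(x)$ a codeword of about $-\log_2 q(x)$ bits, so the average length when events are actually drawn from $p$ is the cross-entropy.

```python
import math

# Illustrative discrete distributions over the same three events.
p = [0.5, 0.25, 0.25]   # true distribution of the events
q = [0.25, 0.25, 0.5]   # estimated distribution the code is optimized for

# Codeword lengths (in bits) of an ideal code built for q.
code_lengths = [-math.log2(qx) for qx in q]

# Average number of bits used when events are drawn from p:
# this is the cross-entropy H(p, q) in bits.
cross_entropy_bits = sum(px * length for px, length in zip(p, code_lengths))

# A code optimized for the true distribution p would need only
# the entropy H(p) bits on average.
entropy_bits = -sum(px * math.log2(px) for px in p)

print(f"H(p, q) = {cross_entropy_bits:.2f} bits")  # 1.75 bits
print(f"H(p)    = {entropy_bits:.2f} bits")        # 1.50 bits
```

For these example distributions the mismatched code costs 1.75 bits per event on average, compared with 1.5 bits for a code matched to the true distribution.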
Definition
The cross-entropy of the distribution $q$ relative to a distribution $p$ over a given set is defined as follows:

$$H(p, q) = -\operatorname{E}_p[\log q],$$

where $\operatorname{E}_p[\cdot]$ is the expected value operator with respect to the distribution $p$.
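For discrete distributions over the same support, the expectation becomes the sum $H(p, q) = -\sum_x p(x) \log q(x)$, which can be computed directly. The sketch below is an illustration, not a reference implementation; the function name and the default choice of base 2 are assumptions.

```python
import math

def cross_entropy(p, q, base=2):
    """Discrete cross-entropy H(p, q) = -sum_x p(x) * log q(x),
    with p and q given as equal-length lists of probabilities.

    Terms with p(x) == 0 contribute nothing; a zero q(x) where
    p(x) > 0 makes the cross-entropy infinite.
    """
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total -= px * math.log(qx, base)
    return total

# Example: the distributions used above give H(p, q) = 1.75 bits.
print(cross_entropy([0.5, 0.25, 0.25], [0.25, 0.25, 0.5]))
```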