In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound declared by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source.
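This bound can be checked numerically for a concrete source. The sketch below is illustrative only: it builds a binary Huffman code (one common entropy coder) for a hypothetical four-symbol distribution and compares the expected code length against the source entropy; the helper `huffman_lengths` is an assumed name, not part of any standard library.

```python
import heapq
import math

def huffman_lengths(probs):
    """Return the Huffman code length for each symbol.

    probs maps symbol -> probability. A counter is used as a
    tie-breaker so heap entries never compare symbol lists.
    """
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees; every symbol in
        # either subtree gains one bit of code length.
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(probs)
entropy = -sum(p * math.log2(p) for p in probs.values())
expected_len = sum(probs[s] * lengths[s] for s in probs)
# The source coding theorem guarantees expected_len >= entropy;
# for this dyadic distribution the two are equal (1.75 bits).
print(entropy, expected_len)
```

Because every probability here is a power of two, the Huffman code meets the entropy bound exactly; for general distributions the expected length is strictly greater, by less than one bit per symbol.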
More precisely, the source coding theorem states that for any source distribution, the expected code length satisfies