In information theory, an entropy coding (or entropy encoding) is any
lossless data compression method that attempts to approach the lower bound declared by
Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source.
More precisely, the source coding theorem states that for any source distribution, the expected code length satisfies

\mathbb{E}_{x \sim P}[\ell(d(x))] \geq \mathbb{E}_{x \sim P}[-\log_b P(x)],

where \ell(d(x)) is the length of the code word for x, d is the coding function, b is the number of symbols in the output alphabet, and P is the distribution of the source symbols.
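The bound can be checked numerically. The sketch below (an illustrative example, not part of the original text) computes the Shannon entropy of a small source distribution and compares it against the expected code length of a binary Huffman code, one common entropy coding method; the distribution chosen here is dyadic, so the Huffman code happens to meet the entropy bound exactly.

```python
import heapq
import math

def entropy(probs, base=2):
    # Shannon entropy H(P) = -sum p * log_b(p): the lower bound
    # on expected code length from the source coding theorem.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def huffman_code_lengths(probs):
    # Build a binary Huffman tree bottom-up; repeatedly merge the
    # two least probable groups, lengthening every code word inside
    # them by one bit. Returns the code length per symbol.
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

# A dyadic source distribution (all probabilities are powers of 1/2).
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
# The theorem guarantees avg_len >= H; here both equal 1.75 bits.
```

For non-dyadic distributions the Huffman code's expected length exceeds the entropy by less than one bit per symbol, which is why entropy is described as a lower bound that entropy coders "attempt to approach" rather than always attain.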