In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p\colon \mathcal{X} \to [0, 1]$, the entropy is

$$H(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

where the sum runs over the variable's possible values. The base of the logarithm sets the unit: base 2 gives bits (or "shannons"), base $e$ gives natural units ("nats"), and base 10 gives "hartleys".
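As a quick illustration of the formula, here is a minimal Python sketch that computes the entropy of a discrete distribution supplied as a list of probabilities. The function name `shannon_entropy` and the example distributions are ours, chosen for illustration; terms with $p(x) = 0$ are skipped, following the standard convention $0 \log 0 = 0$.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution.

    `probs` is a sequence of probabilities p(x) summing to 1.
    Zero-probability outcomes contribute nothing, by the
    convention 0 * log(0) = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly one bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The comparison between the fair and biased coins shows the intuition in the definition: the more predictable the outcome, the less "surprise" each observation carries, and the lower the entropy.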