Hartley Entropy

The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample from a finite set ''A'' is picked uniformly at random, the information revealed once the outcome is known is given by the Hartley function

: H_0(A) := \log_b |A| ,

where |A| denotes the cardinality of ''A''. If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as the bit). If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base the unit of information is called the hartley (also known as the ban or dit) in his honor. It is also known as the Hartley entropy.
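The definition is simple enough to state directly in code. The following is a minimal Python sketch (an illustration added here, not part of Hartley's paper); the unit of the result is selected by the logarithm base.

<syntaxhighlight lang="python">
from math import e, log

def hartley(n, base=2):
    """Hartley function H_0 of a set of n equally likely outcomes.

    base=2 gives shannons (bits), base=e gives nats,
    base=10 gives hartleys (also called bans or dits).
    """
    if n < 1:
        raise ValueError("the set must be non-empty")
    return log(n, base)

print(hartley(16))        # 4.0 shannons (bits)
print(hartley(16, e))     # ~2.7726 nats
print(hartley(16, 10))    # ~1.2041 hartleys
</syntaxhighlight>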


Hartley function, Shannon entropy, and Rényi entropy

The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is a special case of the Rényi entropy, since

: H_0(X) = \frac{1}{1-0} \log \sum_{i=1}^{|X|} p_i^0 = \log |X| .

But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi, the Hartley function can be defined without introducing any notions of probability (see ''Uncertainty and information'' by George J. Klir, p. 423).
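As a quick numerical check (a sketch under the definitions above, not from the original text): for a uniform distribution on n outcomes, the Shannon entropy, the Rényi entropy of order 0, and the Hartley function all agree.

<syntaxhighlight lang="python">
from math import log2

def shannon_entropy(p):
    """Shannon entropy, in shannons (bits), of a probability vector p."""
    return -sum(q * log2(q) for q in p if q > 0)

n = 8
uniform = [1 / n] * n

print(shannon_entropy(uniform))                # 3.0
print(log2(sum(1 for q in uniform if q > 0)))  # 3.0 (order-0 Rényi: log of the support size)
print(log2(n))                                 # 3.0 (Hartley function H_0)
</syntaxhighlight>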


Characterization of the Hartley function

The Hartley function only depends on the number of elements in a set, and hence can be viewed as a function on natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies
# H(mn) = H(m) + H(n) (additivity)
# H(m) \leq H(m+1) (monotonicity)
# H(2) = 1 (normalization)
Condition 1 says that the uncertainty of the Cartesian product of two finite sets ''A'' and ''B'' is the sum of the uncertainties of ''A'' and ''B''. Condition 2 says that a larger set has larger uncertainty.
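The three conditions are easy to spot-check numerically for \log_2 (a sanity check on small inputs, not a substitute for Rényi's uniqueness proof):

<syntaxhighlight lang="python">
from math import isclose, log2

H = log2  # the base-2 Hartley function, viewed as a function on natural numbers

# Condition 1 (additivity): H(mn) = H(m) + H(n)
assert all(isclose(H(m * n), H(m) + H(n))
           for m in range(1, 30) for n in range(1, 30))

# Condition 2 (monotonicity): H(m) <= H(m + 1)
assert all(H(m) <= H(m + 1) for m in range(1, 1000))

# Condition 3 (normalization): H(2) = 1
assert H(2) == 1.0
</syntaxhighlight>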


Derivation of the Hartley function

We want to show that the Hartley function, \log_2 n, is the only function mapping natural numbers to real numbers that satisfies
# H(mn) = H(m) + H(n) (additivity)
# H(m) \leq H(m+1) (monotonicity)
# H(2) = 1 (normalization)
Let ''f'' be a function on positive integers that satisfies the above three properties. From the additivity property, we can show that for any integers ''n'' and ''k'',

: f(n^k) = k f(n).

Let ''a'', ''b'', and ''t'' be any positive integers with a, b \geq 2. There is a unique integer ''s'' determined by

: a^s \leq b^t < a^{s+1}. \qquad (1)

Taking base-2 logarithms,

: s \log_2 a \leq t \log_2 b < (s+1) \log_2 a,

and therefore

: \frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} < \frac{s}{t} + \frac{1}{t}.

On the other hand, applying ''f'' to (1) and using monotonicity,

: f(a^s) \leq f(b^t) \leq f(a^{s+1}).

By the identity f(n^k) = k f(n), this becomes

: s f(a) \leq t f(b) \leq (s+1) f(a),

and, since f(a) \geq f(2) = 1 > 0 by monotonicity and normalization,

: \frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s}{t} + \frac{1}{t}.

Both ratios thus lie in an interval of length 1/t, so

: \left\vert \frac{\log_2 b}{\log_2 a} - \frac{f(b)}{f(a)} \right\vert \leq \frac{1}{t}.

Since ''t'' can be arbitrarily large, the difference on the left-hand side must be zero:

: \frac{f(b)}{f(a)} = \frac{\log_2 b}{\log_2 a}.

Equivalently, f(a)/\log_2 a takes the same value for every a \geq 2, so

: f(a) = \mu \log_2 a

for some constant ''μ'', which must be equal to 1 by the normalization property f(2) = 1.
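The squeeze in the argument above can also be watched numerically. In the sketch below (an added illustration; a = 2 and b = 3 are arbitrary choices), the integer s of inequality (1) is \lfloor t \log_2 b \rfloor, and s/t pins down \log_2 b to within 1/t as t grows:

<syntaxhighlight lang="python">
from math import floor, log2

a, b = 2, 3
for t in (1, 10, 100, 1000, 10000):
    s = floor(t * log2(b))            # the unique s with a**s <= b**t < a**(s+1)
    assert a**s <= b**t < a**(s + 1)  # exact big-integer check of inequality (1)
    print(t, s / t, abs(log2(b) - s / t))  # the gap is at most 1/t
</syntaxhighlight>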


See also

* Rényi entropy
* Min-entropy


References

* {{PlanetMath attribution|id=6082|title=Derivation of Hartley function}}