Entropic Uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is ''stronger'' than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function ''f'' and its Fourier transform ''g'' such that
:g(y) \approx \int_{-\infty}^\infty \exp (-2\pi ixy) f(x)\, dx,\qquad f(x) \approx \int_{-\infty}^\infty \exp (2\pi ixy) g(y)\, dy ~,
where the "≈" indicates convergence in ''L''², and normalized so that (by Plancherel's theorem)
:\int_{-\infty}^\infty |f(x)|^2\, dx = \int_{-\infty}^\infty |g(y)|^2 \,dy = 1~.
He showed that for any such functions the sum of the Shannon entropies is non-negative,
:H(|f|^2) + H(|g|^2) \equiv - \int_{-\infty}^\infty |f(x)|^2 \log |f(x)|^2\, dx - \int_{-\infty}^\infty |g(y)|^2 \log |g(y)|^2 \,dy \ge 0.
A tighter bound,
:H(|f|^2) + H(|g|^2) \ge \log \frac{e}{2} ~,
was conjectured by Hirschman and Everett, proven in 1975 by W. Beckner, and interpreted in the same year as a generalized quantum mechanical uncertainty principle by Białynicki-Birula and Mycielski. The equality holds in the case of Gaussian distributions. Note, however, that the above entropic uncertainty function is distinctly ''different'' from the quantum von Neumann entropy represented in phase space.
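As a numerical illustration of the bound and its equality case (not part of the original argument), here is a minimal Python sketch: it samples a normalized Gaussian on a finite grid, approximates the Fourier transform in the convention above by a Riemann sum, and compares H(|f|²) + H(|g|²) with log(e/2). The grid parameters and the helper name shannon_entropy are illustrative choices.

```python
import numpy as np

# Grid for the Riemann-sum approximation (arbitrary illustration choices).
N, L = 1024, 12.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Normalized Gaussian amplitude: the integral of |f|^2 is 1.
f = 2 ** 0.25 * np.exp(-np.pi * x ** 2)

# Fourier transform with the convention g(y) = \int exp(-2*pi*i*x*y) f(x) dx,
# evaluated on the same grid for the frequency variable.
g = np.exp(-2j * np.pi * np.outer(x, x)) @ f * dx

def shannon_entropy(density, step):
    """Differential Shannon entropy -\int p log p, by Riemann sum."""
    p = np.clip(density, 1e-300, None)  # avoid log(0)
    return -np.sum(p * np.log(p)) * step

H_sum = shannon_entropy(np.abs(f) ** 2, dx) + shannon_entropy(np.abs(g) ** 2, dx)
print(H_sum, np.log(np.e / 2))  # both approximately 0.3069: equality for Gaussians
```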


Sketch of proof

The proof of this tight inequality depends on the so-called (''q'', ''p'')-norm of the Fourier transformation. (Establishing this norm is the most difficult part of the proof.) From this norm, one is able to establish a lower bound on the sum of the (differential) Rényi entropies, H_\alpha(|f|^2) + H_\beta(|g|^2), where 1/\alpha + 1/\beta = 2, which generalize the Shannon entropies. For simplicity, we consider this inequality only in one dimension; the extension to multiple dimensions is straightforward and can be found in the literature cited.
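For completeness (this definition is not written out in the text above, but it is the standard one used here), the differential Rényi entropy of order ''α'' of a probability density ''P'' is
:H_\alpha(P) = \frac{1}{1-\alpha}\log \int_{-\infty}^\infty P(x)^\alpha\,dx~, \qquad \lim_{\alpha\to 1} H_\alpha(P) = -\int_{-\infty}^\infty P(x)\log P(x)\,dx = H(P)~,
so the Shannon entropy is recovered in the limit ''α'' → 1.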


Babenko–Beckner inequality

The (''q'', ''p'')-norm of the Fourier transform is defined to be
:\|\mathcal F\|_{q,p} = \sup_{f\in L^p(\mathbb R)} \frac{\|\mathcal Ff\|_q}{\|f\|_p}~, \qquad \text{where } 1 < p \le 2 \text{ and } \frac 1 p + \frac 1 q = 1.
In 1961, Babenko found this norm for ''even'' integer values of ''q''. Finally, in 1975, using Hermite functions as eigenfunctions of the Fourier transform, Beckner proved that the value of this norm (in one dimension) for all ''q'' ≥ 2 is
:\|\mathcal F\|_{q,p} = \sqrt{p^{1/p}/q^{1/q}}.
Thus we have the Babenko–Beckner inequality that
:\|\mathcal Ff\|_q \le \left(p^{1/p}/q^{1/q}\right)^{1/2} \|f\|_p.
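As a quick sanity check (a sketch, not part of the proof), the following Python snippet discretizes the Fourier transform by the same Riemann-sum approach as above and compares ‖ℱf‖_q against the Babenko–Beckner bound for the conjugate pair p = 4/3, q = 4; the two test functions and the grid sizes are arbitrary choices.

```python
import numpy as np

# Discretized Fourier transform, convention g(y) = \int exp(-2*pi*i*x*y) f(x) dx.
N, L = 1024, 12.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
kernel = np.exp(-2j * np.pi * np.outer(x, x)) * dx  # same grid for x and y

def lp_norm(values, p, step):
    """||h||_p = (\int |h|^p)^(1/p), by Riemann sum."""
    return (np.sum(np.abs(values) ** p) * step) ** (1.0 / p)

p, q = 4.0 / 3.0, 4.0                            # conjugate exponents: 1/p + 1/q = 1
constant = (p ** (1 / p) / q ** (1 / q)) ** 0.5  # Babenko-Beckner constant

for f in (np.exp(-np.pi * x ** 2),                    # Gaussian: saturates the bound
          np.exp(-np.abs(x)) * (1 + np.cos(3 * x))):  # some non-Gaussian profile
    g = kernel @ f
    lhs = lp_norm(g, q, dx)
    rhs = constant * lp_norm(f, p, dx)
    print(lhs <= rhs + 1e-6, lhs / rhs)  # ratio is 1 only in the Gaussian case
```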


Rényi entropy bound

From this inequality, an expression of the uncertainty principle in terms of the Rényi entropy can be derived (Heinig & Smith 1986).

Letting g=\mathcal Ff, 2''α'' = ''p'', and 2''β'' = ''q'', so that 1/''α'' + 1/''β'' = 2 and 1/2 < ''α'' < 1 < ''β'', we have
:\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right)^{1/2\beta} \le \frac{(2\alpha)^{1/4\alpha}}{(2\beta)^{1/4\beta}} \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right)^{1/2\alpha}.
Squaring both sides and taking the logarithm, we get
:\frac 1\beta \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \le \frac 1 2 \log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} + \frac 1\alpha \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right).
Multiplying both sides by
:\frac{\beta}{1-\beta}=-\frac{\alpha}{1-\alpha},
which is negative, reverses the sense of the inequality,
:\frac{1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac{\alpha}{2(1-\alpha)}\log\frac{(2\beta)^{1/\beta}}{(2\alpha)^{1/\alpha}} - \frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right) ~.
Rearranging terms finally yields an inequality in terms of the sum of the Rényi entropies,
:\frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right) + \frac{1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac{\alpha}{2(1-\alpha)}\log\frac{(2\beta)^{1/\beta}}{(2\alpha)^{1/\alpha}};
: H_\alpha(|f|^2) + H_\beta(|g|^2) \ge \frac 1 2 \left(\frac{\log\alpha}{\alpha-1}+\frac{\log\beta}{\beta-1}\right) - \log 2 ~.
Note that this inequality is symmetric with respect to ''α'' and ''β'': one no longer need assume that ''α'' < ''β''; only that they are positive and not both one, and that 1/''α'' + 1/''β'' = 2. To see this symmetry, simply exchange the rôles of ''i'' and −''i'' in the Fourier transform.
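The same discretization as in the earlier sketches gives a numerical spot check of this final Rényi inequality; the choices α = 3/4, β = 3/2 (which satisfy 1/α + 1/β = 2), the two-sided exponential test amplitude, and the grid are illustrative assumptions.

```python
import numpy as np

N, L = 1024, 12.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
kernel = np.exp(-2j * np.pi * np.outer(x, x)) * dx  # g = F f, same convention as above

def renyi_entropy(density, order, step):
    """Differential Renyi entropy H_alpha(P) = log(\int P^alpha) / (1 - alpha)."""
    return np.log(np.sum(density ** order) * step) / (1.0 - order)

alpha, beta = 0.75, 1.5  # satisfies 1/alpha + 1/beta = 2

f = np.exp(-np.abs(x))                 # two-sided exponential test amplitude
f /= np.sqrt(np.sum(f ** 2) * dx)      # normalize so \int |f|^2 = 1
g = kernel @ f

lhs = renyi_entropy(np.abs(f) ** 2, alpha, dx) + renyi_entropy(np.abs(g) ** 2, beta, dx)
rhs = 0.5 * (np.log(alpha) / (alpha - 1) + np.log(beta) / (beta - 1)) - np.log(2)
print(lhs >= rhs, lhs, rhs)            # roughly 0.34 >= 0.29 for this test function
```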


Shannon entropy bound

Taking the limit of this last inequality as ''α, β'' → 1 yields the less general Shannon entropy inequality,
:H(|f|^2) + H(|g|^2) \ge \log\frac e 2,\quad\text{where}\quad g(y) \approx \int_{\mathbb R} e^{-2\pi ixy}f(x)\,dx~,
valid for any base of logarithm, as long as we choose an appropriate unit of information, bit, nat, etc.

The constant will be different, though, for a different normalization of the Fourier transform (such as is usually used in physics, with normalizations chosen so that ''ħ'' = 1), i.e.,
:H(|f|^2) + H(|g|^2) \ge \log(\pi e)\quad\text{where}\quad g(y) \approx \frac{1}{\sqrt{2\pi}}\int_{\mathbb R} e^{-ixy}f(x)\,dx~.
In this case, the dilation of the Fourier transform absolute squared by a factor of 2''π'' simply adds log(2''π'') to its entropy.
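To illustrate how the constant tracks the normalization, the following Python sketch (the non-Gaussian test amplitude and grid are arbitrary assumptions) transforms one and the same amplitude under both conventions, checks the two bounds, and verifies that the entropy sums differ by log(2π).

```python
import numpy as np

N, L = 2048, 16.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

def shannon_entropy(density, step):
    p = np.clip(density, 1e-300, None)  # avoid log(0)
    return -np.sum(p * np.log(p)) * step

# A normalized, non-Gaussian test amplitude (arbitrary illustration choice).
f = (1 + x ** 2) * np.exp(-x ** 2 / 2)
f /= np.sqrt(np.sum(f ** 2) * dx)

# Convention used above: g_A(y) = \int exp(-2*pi*i*x*y) f(x) dx.
g_a = np.exp(-2j * np.pi * np.outer(x, x)) @ f * dx
# Physics (hbar = 1) convention: g_B(y) = (2*pi)^(-1/2) \int exp(-i*x*y) f(x) dx.
g_b = np.exp(-1j * np.outer(x, x)) @ f * dx / np.sqrt(2 * np.pi)

h_f = shannon_entropy(np.abs(f) ** 2, dx)
h_a = shannon_entropy(np.abs(g_a) ** 2, dx)
h_b = shannon_entropy(np.abs(g_b) ** 2, dx)
print(h_f + h_a >= np.log(np.e / 2), h_f + h_b >= np.log(np.pi * np.e))
print(h_b - h_a, np.log(2 * np.pi))  # the two conventions differ by log(2*pi)
```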


Entropy versus variance bounds

The Gaussian or normal probability distribution plays an important role in the relationship between variance and entropy: it is a problem of the calculus of variations to show that this distribution maximizes entropy for a given variance, and at the same time minimizes the variance for a given entropy. In fact, for any probability density function \phi on the real line, Shannon's entropy inequality specifies
:H(\phi) \le \log \sqrt{2\pi e V(\phi)}~,
where ''H'' is the Shannon entropy and ''V'' is the variance, an inequality that is saturated only in the case of a normal distribution.

Moreover, the Fourier transform of a Gaussian probability amplitude function is also Gaussian, and the absolute squares of both of these are Gaussian, too. This can then be used to derive the usual Robertson variance uncertainty inequality from the above entropic inequality, enabling ''the latter to be tighter than the former''. That is (for ''ħ'' = 1), exponentiating the Hirschman inequality and using Shannon's expression above,
:\frac 1 2 \le \exp \left(H(|f|^2)+H(|g|^2)\right) /(2e\pi) \le \sqrt{V(|f|^2)\,V(|g|^2)} ~.

Hirschman explained that entropy (his version of entropy was the negative of Shannon's) is a "measure of the concentration of [a] probability distribution in a set of small measure." Thus ''a low or large negative Shannon entropy means that a considerable mass of the probability distribution is confined to a set of small measure''. Note that this set of small measure need not be contiguous; a probability distribution can have several concentrations of mass in intervals of small measure, and the entropy may still be low no matter how widely scattered those intervals are.

This is not the case with the variance: variance measures the concentration of mass about the mean of the distribution, and a low variance means that a considerable mass of the probability distribution is concentrated in a ''contiguous interval'' of small measure.

To formalize this distinction, we say that two probability density functions \phi_1 and \phi_2 are equimeasurable if
:\forall \delta > 0,\quad \mu\{x\in\mathbb R : \phi_1(x)\ge\delta\} = \mu\{x\in\mathbb R : \phi_2(x)\ge\delta\}\,,
where ''μ'' is the Lebesgue measure. Any two equimeasurable probability density functions have the same Shannon entropy, and in fact the same Rényi entropy of any order. The same is not true of variance, however. Any probability density function has a radially decreasing equimeasurable "rearrangement" whose variance is less (up to translation) than any other rearrangement of the function; and there exist rearrangements of arbitrarily high variance (all having the same entropy).
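This difference is easy to see numerically. The sketch below (bump width, separations, and grid are arbitrary illustration choices) compares a single Gaussian of doubled width with an equimeasurable pair of half-weight narrow bumps translated far apart: their Shannon entropies agree, while the variance of the two-bump density grows without bound as the separation increases.

```python
import numpy as np

x = np.linspace(-60, 60, 20001)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def entropy(p, step):
    q = np.clip(p, 1e-300, None)
    return -np.sum(q * np.log(q)) * step

def variance(p, x, step):
    mean = np.sum(x * p) * step
    return np.sum((x - mean) ** 2 * p) * step

sigma = 0.5
single = gaussian(x, 0.0, 2 * sigma)   # radially decreasing "rearrangement"
for a in (5.0, 20.0, 50.0):            # increasing separation of the two bumps
    double = 0.5 * (gaussian(x, -a, sigma) + gaussian(x, a, sigma))
    print(a, entropy(double, dx) - entropy(single, dx), variance(double, x, dx))
# The entropy difference stays ~0 for well-separated bumps; the variance grows like a**2.
```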


See also

* Inequalities in information theory
* Logarithmic Schrödinger equation
* Uncertainty principle
* Riesz–Thorin theorem
* Fourier transform


References

* H.P. Heinig and M. Smith, ''Extensions of the Heisenberg–Weil inequality''. Internat. J. Math. & Math. Sci., Vol. 9, No. 1 (1986), pp. 185–192.

Further reading

* Jizba, P.; Ma, Y.; Hayes, A.; Dunningham, J. A. (2016). "One-parameter class of uncertainty relations based on entropy power". ''Phys. Rev. E'' '''93''' (6): 060104(R). doi:10.1103/PhysRevE.93.060104
* arXiv:math/0605510v1