Entropic uncertainty


In quantum mechanics and information theory, entropic uncertainty is the sum of the entropies of non-commuting observables, that is, of the probability distributions of their measurement outcomes. Various lower bounds on entropic uncertainty have been derived.

Hirschman uncertainty

Hirschman uncertainty is defined as the sum of the temporal and spectral differential entropies. Take for example the position and momentum of a single particle in one dimension. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.
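Concretely, for a single particle in one dimension (a sketch added for orientation, using the physics normalization ħ = 1 discussed further below), the two entropies in question are those of the position density |ψ(x)|² and the momentum density |ψ̃(p)|²:

:<math>H_x = -\int_{\mathbb R} |\psi(x)|^2 \log |\psi(x)|^2 \,dx~, \qquad H_p = -\int_{\mathbb R} |\tilde\psi(p)|^2 \log |\tilde\psi(p)|^2 \,dp~,</math>

and the entropic uncertainty principle then takes the form Hx + Hp ≥ log(eπ), which implies, but is not implied by, the familiar σx σp ≥ 1/2.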

In 1957,[1] Hirschman considered a function f and its Fourier transform g such that

:<math>g(y) \approx \int_{\mathbb R} e^{-2\pi ixy}f(x)\,dx~, \qquad f(x) \approx \int_{\mathbb R} e^{2\pi ixy}g(y)\,dy~,</math>

where the "≈" indicates convergence in L², and normalized so that (by Plancherel's theorem)

:<math>\int_{\mathbb R} |f(x)|^2\,dx = \int_{\mathbb R} |g(y)|^2\,dy = 1~.</math>

He showed that for any such functions the sum of the Shannon entropies is non-negative,

:<math>H(|f|^2) + H(|g|^2) \equiv -\int_{\mathbb R} |f(x)|^2 \log |f(x)|^2\,dx - \int_{\mathbb R} |g(y)|^2 \log |g(y)|^2\,dy \ge 0~.</math>

A tighter bound,

:<math>H(|f|^2) + H(|g|^2) \ge \log\frac e 2~,</math>

was conjectured by Hirschman[1] and Everett,[2] proven in 1975 by W. Beckner[3] and in the same year interpreted as a generalized quantum mechanical uncertainty principle by Białynicki-Birula and Mycielski.[4] The equality holds in the case of Gaussian distributions.[5]
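As a quick numerical check of the Gaussian equality case (a sketch added here, not part of the original article; the particular Gaussian and grid are illustrative choices):

<syntaxhighlight lang="python">
import numpy as np

# Check that a Gaussian saturates the Hirschman bound H(|f|^2) + H(|g|^2) >= log(e/2),
# with the convention g(y) = integral of exp(-2*pi*i*x*y) f(x) dx used in this article.
# For f(x) = (2a)^(1/4) exp(-pi*a*x^2), the transform is g(y) = (2/a)^(1/4) exp(-pi*y^2/a).

a = 0.7                                   # arbitrary width parameter
x = np.linspace(-40.0, 40.0, 200_001)
dx = x[1] - x[0]

f = (2 * a) ** 0.25 * np.exp(-np.pi * a * x**2)    # unit L2 norm
g = (2 / a) ** 0.25 * np.exp(-np.pi * x**2 / a)    # its Fourier transform, in closed form

def differential_entropy(density, step):
    """Riemann-sum approximation of -integral p log p dx on a uniform grid."""
    p = np.clip(density, 1e-300, None)    # avoid log(0) where the tails underflow
    return -np.sum(p * np.log(p)) * step

total = differential_entropy(np.abs(f) ** 2, dx) + differential_entropy(np.abs(g) ** 2, dx)
print("H(|f|^2) + H(|g|^2) =", total)     # ~0.3069 for any a > 0
print("log(e/2)            =", np.log(np.e / 2.0))
</syntaxhighlight>

Independently of the width parameter a, the sum comes out equal to log(e/2) ≈ 0.3069, in line with the equality statement above.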

Note, however, that the above entropic uncertainty function is distinctly different from the quantum Von Neumann entropy represented in phase space.

Sketch of proof

The proof of this tight inequality depends on the so-called (q, p)-norm of the Fourier transformation. (Establishing this norm is the most difficult part of the proof.)

From this norm, one is able to establish a lower bound on the sum of the (differential) Rényi entropies, Hα(|f|²) + Hβ(|g|²), where 1/α + 1/β = 2, which generalize the Shannon entropies. For simplicity, we consider this inequality only in one dimension; the extension to multiple dimensions is straightforward and can be found in the literature cited.
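For reference (standard definitions, restated here for readability), the Rényi differential entropy of order α > 0, α ≠ 1, of a probability density P on the real line is

:<math>H_\alpha(P) = \frac{1}{1-\alpha}\log\left(\int_{\mathbb R} P(x)^\alpha \,dx\right)~,</math>

and it reduces to the Shannon differential entropy H(P) = −∫ P log P dx in the limit α → 1.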

Babenko–Beckner inequality

The (q, p)-norm of the Fourier transform is defined to be[6]

:<math>\|\mathcal F\|_{q,p} = \sup_{f\in L^p(\mathbb R)} \frac{\|\mathcal F f\|_q}{\|f\|_p}~,</math>

where 1 < p ≤ 2 and 1/p + 1/q = 1.

In 1961, Babenko[7] found this norm for even integer values of q. Finally, in 1975, using Hermite functions as eigenfunctions of the Fourier transform, Beckner[3] proved that the value of this norm (in one dimension) for all q ≥ 2 is

:<math>\|\mathcal F\|_{q,p} = \sqrt{p^{1/p}/q^{1/q}}~.</math>

Thus we have the Babenko–Beckner inequality that

:<math>\|\mathcal Ff\|_q \le \left(p^{1/p}/q^{1/q}\right)^{1/2} \|f\|_p~.</math>
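As a concrete illustration (numerical values added here), for the conjugate pair p = 4/3, q = 4 the sharp constant evaluates to

:<math>\left(\frac{(4/3)^{3/4}}{4^{1/4}}\right)^{1/2} \approx \left(\frac{1.241}{1.414}\right)^{1/2} \approx 0.937~,</math>

strictly smaller than the constant 1 that the elementary Hausdorff–Young inequality gives for this normalization of the Fourier transform.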

Rényi entropy bound

From this inequality, an expression of the uncertainty principle in terms of the Rényi entropy can be derived.[6][8]

Letting g = 𝓕f, 2α = p, and 2β = q, so that 1/α + 1/β = 2 and 1/2 < α < 1 < β, we have

:<math>\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right)^{\frac{1}{2\beta}} \le \left(\frac{(2\alpha)^{\frac{1}{2\alpha}}}{(2\beta)^{\frac{1}{2\beta}}}\right)^{\frac12} \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right)^{\frac{1}{2\alpha}}~.</math>

Squaring both sides and taking the logarithm, we get

:<math>\frac 1\beta \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \le \frac12\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} + \frac1\alpha\log\left(\int_{\mathbb R}|f(x)|^{2\alpha}\,dx\right)~.</math>

Multiplying both sides by

:<math>\frac{\beta}{1-\beta} = -\frac{\alpha}{1-\alpha} < 0</math>

reverses the sense of the inequality,

:<math>\frac{1}{1-\beta}\log\left(\int_{\mathbb R}|g(y)|^{2\beta}\,dy\right) \ge \frac{\beta}{2(1-\beta)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} - \frac{1}{1-\alpha}\log\left(\int_{\mathbb R}|f(x)|^{2\alpha}\,dx\right)~.</math>

Rearranging terms, and simplifying the constant with the help of 1/α + 1/β = 2, finally yields an inequality in terms of the sum of the Rényi entropies,

:<math>H_\alpha(|f|^2) + H_\beta(|g|^2) \ge \frac12\left(\frac{\log\alpha}{\alpha-1}+\frac{\log\beta}{\beta-1}\right) - \log 2~.</math>

Note that this inequality is symmetric with respect to α and β: one no longer needs to assume that α < β; only that they are positive and not both one, and that 1/α + 1/β = 2. To see this symmetry, simply exchange the rôles of i and −i in the Fourier transform.

Shannon entropy bound

Taking the limit of this last inequality as α, β → 1 yields the less general Shannon entropy inequality,

:<math>H(|f|^2) + H(|g|^2) \ge \log\frac e 2~,\quad\textrm{where}\quad g(y) \approx \int_{\mathbb R} e^{-2\pi ixy}f(x)\,dx~,</math>

valid for any base of logarithm, as long as we choose an appropriate unit of information, bit, nat, etc.
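For instance (numerical values added for illustration), in natural logarithms the bound reads

:<math>\ln\frac e2 = 1 - \ln 2 \approx 0.307\ \text{nat}~, \qquad\text{while}\qquad \log_2\frac e2 = \frac{1-\ln 2}{\ln 2}\approx 0.443\ \text{bit}~.</math>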

The constant will be different, though, for a different normalization of the Fourier transform (such as is usually used in physics, with normalizations chosen so that ħ = 1), i.e.,

:<math>H(|f|^2) + H(|g|^2) \ge \log(\pi e)~,\quad\textrm{where}\quad g(y) \approx \frac{1}{\sqrt{2\pi}}\int_{\mathbb R} e^{-ixy}f(x)\,dx~.</math>

In this case, the dilation of the Fourier transform absolute squared by a factor of 2π simply adds log(2π) to its entropy.
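The scaling behaviour invoked here is the standard dilation property of differential entropy (a short check added for completeness): if P is a probability density and Pc(x) = c⁻¹P(x/c) with c > 0, then

:<math>H(P_c) = -\int_{\mathbb R} \frac1c P\!\left(\frac xc\right)\log\!\left[\frac1c P\!\left(\frac xc\right)\right]dx = H(P) + \log c~,</math>

so a dilation by c = 2π adds exactly log(2π), and indeed log(e/2) + log(2π) = log(πe).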

Entropy versus variance bounds

The Gaussian or normal probability distribution plays an important role in the relationship between variance and entropy: it is a problem of the calculus of variations to show that this distribution maximizes entropy for a given variance, and at the same time minimizes the variance for a given entropy. In fact, for any probability density function φ on the real line, Shannon's entropy inequality specifies:

:<math>H(\phi) \le \log \sqrt {2\pi eV(\phi)}~,</math>

where H is the Shannon entropy and V is the variance, an inequality that is saturated only in the case of a normal distribution.
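As a check (computation added here), the normal density with variance V saturates this bound:

:<math>\phi(x) = \frac{1}{\sqrt{2\pi V}}\,e^{-x^2/2V} \quad\Longrightarrow\quad H(\phi) = \int_{\mathbb R}\phi(x)\left(\frac{x^2}{2V} + \log\sqrt{2\pi V}\right)dx = \tfrac12 + \log\sqrt{2\pi V} = \log\sqrt{2\pi e V}~.</math>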

Moreover, the Fourier transform of a Gaussian probability amplitude function is also Gaussian, and the absolute squares of both of these are Gaussian, too. This can then be used to derive the usual Robertson variance uncertainty inequality from the above entropic inequality, showing that the latter is stronger than the former. That is (for ħ = 1), exponentiating the Hirschman inequality and using Shannon's expression above,

:<math>\sqrt{V(|f|^2)\,V(|g|^2)} \ge \frac{e^{H(|f|^2)+H(|g|^2)}}{2\pi e} \ge \frac{e^{\log(\pi e)}}{2\pi e} = \frac12~.</math>

Hirschman[1] explained that entropy—his version of entropy was the negative of Shannon's—is a "measure of the concentration of [a probability distribution] in a set of small measure." Thus a low or large negative Shannon entropy means that a considerable mass of the probability distribution is confined to a set of small measure.

Note that this set of small measure need not be contiguous; a probability distribution can have several concentrations of mass in intervals of small measure, and the entropy may still be low no matter how widely scattered those intervals are. This is not the case with the variance: variance measures the concentration of mass about the mean of the distribution, and a low variance means that a considerable mass of the probability distribution is concentrated in a contiguous interval of small measure.

To formalize this distinction, we say that two probability density functions φ1 and φ2 are equimeasurable if

:<math>\forall \delta > 0,\,\mu\{x\in\mathbb R|\phi_1(x)\ge\delta\} = \mu\{x\in\mathbb R|\phi_2(x)\ge\delta\},</math>

where μ is the Lebesgue measure. Any two equimeasurable probability density functions have the same Shannon entropy, and in fact the same Rényi entropy of any order. The same is not true of variance, however. Any probability density function has a radially decreasing equimeasurable "rearrangement" whose variance is less (up to translation) than that of any other rearrangement of the function; and there exist rearrangements of arbitrarily high variance (all having the same entropy).
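A simple family illustrating both points (example added here; ψ is any fixed density, the particular choice is immaterial): let ψ be a probability density supported on [−1, 1] with mean 0 and variance V(ψ), and for a > 1 set

:<math>\phi_a(x) = \tfrac12\,\psi(x-a) + \tfrac12\,\psi(x+a)~.</math>

Since the two bumps do not overlap, all the φa are equimeasurable with one another, hence share the same Shannon and Rényi entropies, while V(φa) = V(ψ) + a² grows without bound.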

Finite-dimensional Hilbert spaces

More recently, there has been research into entropic uncertainty relations in finite-dimensional Hilbert spaces. A much-cited reference is Maassen and Uffink (1988).[9]
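For orientation (a commonly quoted form of that result, restated here): for observables A and B with orthonormal eigenbases {|aᵢ⟩} and {|bⱼ⟩} on a finite-dimensional Hilbert space, the Maassen–Uffink bound is usually written as

:<math>H(A) + H(B) \ge -2\log c~, \qquad c = \max_{i,j}\left|\langle a_i | b_j\rangle\right|~,</math>

where H(A) and H(B) are the Shannon entropies of the outcome distributions obtained by measuring A and B on the same state.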

See also

References

  1. ^ a b c Hirschman, I. I., Jr. (1957), "A note on entropy", American Journal of Mathematics, 79 (1): 152–156, doi:10.2307/2372390, JSTOR 2372390.
  2. ^ Hugh Everett, III. The Many-Worlds Interpretation of Quantum Mechanics: the theory of the universal wave function (Ph.D. dissertation).
  3. ^ a b Beckner, W. (1975), "Inequalities in Fourier analysis", Annals of Mathematics, 102 (6): 159–182, doi:10.2307/1970980, JSTOR 1970980.
  4. ^ Bialynicki-Birula, I.; Mycielski, J. (1975), "Uncertainty Relations for Information Entropy in Wave Mechanics", Communications in Mathematical Physics, 44 (2): 129, Bibcode:1975CMaPh..44..129B, doi:10.1007/BF01608825
  5. ^ Ozaydin, Murad; Przebinda, Tomasz (2004). "An Entropy-based Uncertainty Principle for a Locally Compact Abelian Group" (PDF). Journal of Functional Analysis. 215 (1). Elsevier Inc.: 241–252. doi:10.1016/j.jfa.2003.11.008. Retrieved 2011-06-23.
  6. ^ a b Bialynicki-Birula, I. (2006). "Formulation of the uncertainty relations in terms of the Rényi entropies". Physical Review A. 74 (5): 052101. doi:10.1103/PhysRevA.74.052101.
  7. ^ K.I. Babenko. An inequality in the theory of Fourier analysis. Izv. Akad. Nauk SSSR, Ser. Mat. 25 (1961), pp. 531–542; English transl., Amer. Math. Soc. Transl. (2) 44, pp. 115–128.
  8. ^ H.P. Heinig and M. Smith, Extensions of the Heisenberg–Weil inequality. Internat. J. Math. & Math. Sci., Vol. 9, No. 1 (1986) pp. 185–192. http://www.hindawi.com/GetArticle.aspx?doi=10.1155/S0161171286000212
  9. ^ Maassen, Hans; Uffink, J. (1988). "Generalized entropic uncertainty relations". Physical Review Letters. 60 (12): 1103–1106. doi:10.1103/PhysRevLett.60.1103.

Further reading

  • The paper identified by doi:10.1016/j.physa.2006.09.019 (Physica A). arXiv:math/0605510v1.
  • Maassen, Hans; Uffink, J. (1988). "Generalized entropic uncertainty relations". Physical Review Letters. 60 (12): 1103–1106. doi:10.1103/PhysRevLett.60.1103.