# Theoretical Physics

**2 July 2013**

**Time:** n/a

**Location:**

Petr Jizba (Prague)

On the role of information-theoretic uncertainty relations in quantum theory

The essence of quantum-mechanical uncertainty relations is to place an upper bound on the degree of concentration of two (or more) probability distributions or, equivalently, to impose a lower bound on the associated uncertainties. In the usual Heisenberg uncertainty relations (HUR) the measure of uncertainty/concentration is the variance. While the variance is often a good measure of the concentration of a given distribution, there are many situations where this is not the case; heavy-tailed distributions are one example.
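For orientation, the variance-based benchmark being generalized here is the textbook Heisenberg relation for a position-momentum pair:

```latex
\sigma_x \,\sigma_p \;\geq\; \frac{\hbar}{2},
\qquad
\sigma_x^2 \;=\; \langle \hat{x}^2\rangle - \langle \hat{x}\rangle^2 .
```

For a heavy-tailed distribution such as the Cauchy (Lorentzian) distribution the variance diverges, so this bound carries no information, which is what motivates the entropic alternatives below.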

It may thus be desirable to quantify the inherent quantum unpredictability in a different, and often more expedient, way. A particularly distinct class of such non-variance-based uncertainty relations are those based on information theory. There the uncertainty is quantified in terms of various information measures (entropies), which often provide more stringent (and intuitive) bounds on the concentrations of the involved probability distributions.
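A standard example of such an entropic relation, added here for concreteness, is the Białynicki-Birula-Mycielski inequality for the position and momentum distributions of a quantum state:

```latex
h(x) + h(p) \;\geq\; \ln(\pi e \hbar),
\qquad
h(x) \;=\; -\int \mathrm{d}x\, |\psi(x)|^2 \ln |\psi(x)|^2 .
```

Because the Gaussian maximizes differential entropy at fixed variance, this entropic bound implies the variance-based HUR, while remaining informative for distributions whose variance diverges.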

In this talk I first briefly review existing information-theoretic uncertainty relations, both for discrete and for continuous distribution functions. As a next step I extend these results to Rényi's entropy and the related differential entropy [1]. The concept of Rényi's entropy power will be introduced axiomatically and the ensuing entropy power inequalities will be derived. In the usual Shannonian information theory the entropy power inequalities are tightly connected with the Fisher-information (or Cramér-Rao) inequality, which yields the usual (variance-based) HUR. In the present case the Fisher-information inequality boils down to a two-parameter class of inequalities which reduce to the Cramér-Rao case only for a particular choice of parameters.
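For reference, the standard definitions entering this discussion are Rényi's entropy of order α (with its Shannon limit), the Shannon entropy power, and Stam's inequality; the axiomatic Rényi entropy power itself is constructed in [1] and not reproduced here:

```latex
H_\alpha(\mathcal{P}) \;=\; \frac{1}{1-\alpha}\,\ln \sum_k p_k^{\alpha}
\;\xrightarrow[\;\alpha \to 1\;]{}\; -\sum_k p_k \ln p_k ,
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)} ,
\qquad
N(X)\, J(X) \;\geq\; 1 ,
```

where J(X) is the Fisher information. Combined with N(X) ≤ σ_X² (saturated by the Gaussian), Stam's inequality yields the Cramér-Rao bound σ_X² ≥ 1/J(X) and from there the variance-based HUR.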

Another way to generate information-theoretic uncertainty relations is to use Deutsch's entropic uncertainty principle [2]. With the help of the Riesz-Thorin inequality I will derive the corresponding generalization for Rényi's entropy. I will also give a geometric illustration of this inequality in terms of Bhattacharyya's distance [3]. This will in turn reveal a close connection with the geodesic rule in the information geometry based on the Fisher-Rao metric [4]. This way of estimating inaccuracy fits well into the general geometric approach to statistical inference based on the Fisher-Rao information metric. For this reason these uncertainty relations can be called information-geometric uncertainty relations (IGUR).
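In standard notation, writing c = max |⟨a_m|b_n⟩| for the maximal overlap between the two measurement bases, Deutsch's bound and its Rényi sharpening obtained via the Riesz-Thorin inequality (the Maassen-Uffink-type form) read

```latex
H(A) + H(B) \;\geq\; 2\ln\frac{2}{1+c}\,,
\qquad
H_\alpha(A) + H_\beta(B) \;\geq\; -2\ln c\,,
\quad \frac{1}{\alpha} + \frac{1}{\beta} = 2\,,
\qquad
B(\mathcal{P},\mathcal{Q}) \;=\; -\ln \sum_k \sqrt{p_k\, q_k}\,,
```

where the last expression is Bhattacharyya's distance between two discrete distributions, which furnishes the geometric illustration.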

The potency of the above two types of uncertainty relations will be illustrated on a simple two-level system. There the generalized Fisher-information inequalities substantially improve upon the usual HUR. Improvement is also achieved at the level of the IGUR, which improve upon Deutsch's entropic uncertainty relation. The obtained restrictions on quantum distributions are of relevance, e.g., in the Jaynes-Cummings model.
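As a sketch of the kind of check the two-level illustration involves (the state, basis choice, and Rényi orders below are my own illustrative assumptions, not those of the talk), one can verify the entropic bounds numerically for a qubit measured in two mutually unbiased bases:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in nats; alpha -> 1 recovers Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Qubit state |psi> = cos(t)|0> + sin(t)|1>, measured in the Z and X bases.
theta = 0.3
psi = np.array([np.cos(theta), np.sin(theta)])

p_z = np.abs(psi) ** 2                            # Z-basis outcome probabilities
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
p_x = np.abs(x_basis @ psi) ** 2                  # X-basis outcome probabilities

# Maximal basis overlap c = 1/sqrt(2) for mutually unbiased qubit bases,
# so the Maassen-Uffink-type bound is -2 ln c = ln 2.
c = 1.0 / np.sqrt(2)

# Shannon case: H(Z) + H(X) >= ln 2
lhs_shannon = renyi_entropy(p_z, 1.0) + renyi_entropy(p_x, 1.0)
print(lhs_shannon >= -2 * np.log(c))   # True

# Rényi case with conjugate indices 1/alpha + 1/beta = 2
alpha, beta = 2.0, 2.0 / 3.0
lhs_renyi = renyi_entropy(p_z, alpha) + renyi_entropy(p_x, beta)
print(lhs_renyi >= -2 * np.log(c))     # True
```

Sweeping `theta` over a grid would show how far each bound is from being saturated, which is the sense in which one relation can "improve upon" another for a two-level system.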

[1] P. Jizba and T. Arimitsu, Ann. Phys. (NY) 312, 17 (2004).

[2] D. Deutsch, Phys. Rev. Lett. 50, 631 (1983).

[3] P. Jizba and T. Arimitsu, Phys. Rev. E 69, 026128 (2004).

[4] e.g., J. Burbea and C. Rao, Prob. Math. Statist. 3, 115 (1982).

© Copyright Leeds 2011