Entropy Accumulation
Communications in Mathematical Physics
Frédéric Dupuis¹,²,⁵, Omar Fawzi³, Renato Renner⁴

1 Université de Lorraine, CNRS, Inria, LORIA, 54000 Nancy, France
2 Faculty of Informatics, Masaryk University, Brno, Czech Republic
3 Laboratoire de l'Informatique du Parallélisme, ENS de Lyon, Lyon, France. E-mail: [email protected]
4 Institute for Theoretical Physics, ETH Zurich, Zurich, Switzerland
5 Département d'informatique et de recherche opérationnelle, Université de Montréal, Montreal, Quebec, Canada

Received: 18 December 2017 / Accepted: 23 June 2020
Published online: 23 September 2020 – © The Author(s) 2020
Abstract: We ask the question whether entropy accumulates, in the sense that the operationally relevant total uncertainty about an n-partite system $A = (A_1, \ldots, A_n)$ corresponds to the sum of the entropies of its parts $A_i$. The Asymptotic Equipartition Property implies that this is indeed the case to first order in n, under the assumption that the parts $A_i$ are identical and independent of each other. Here we show that entropy accumulation occurs more generally, i.e., without an independence assumption, provided one quantifies the uncertainty about the individual systems $A_i$ by the von Neumann entropy of suitably chosen conditional states. The analysis of a large system can hence be reduced to the study of its parts. This is relevant for applications. In device-independent cryptography, for instance, the approach yields essentially optimal security bounds valid for general attacks, as shown by Arnon-Friedman et al. (SIAM J Comput 48(1):181–225, 2019).

1. Introduction

In classical information theory, the uncertainty one has about a variable A given access to side information B can be operationally quantified by the number of bits one would need to learn, in addition to B, in order to reconstruct A. While this number generally fluctuates, it is, except with probability of order ε > 0, not larger than the ε-smooth max-entropy, $H_{\max}^{\varepsilon}(A|B)_\rho$, evaluated for the joint distribution ρ of A and B [45].¹ Conversely, it is in the same way not smaller than the ε-smooth min-entropy, $H_{\min}^{\varepsilon}(A|B)_\rho$. This may be summarised by saying that the number of bits needed to reconstruct A from B is with probability at least 1 − O(ε) contained in the interval

$$I = \left[\, H_{\min}^{\varepsilon}(A|B)_\rho \,,\; H_{\max}^{\varepsilon}(A|B)_\rho \,\right], \tag{1}$$
whose boundaries are defined by the smooth entropies. We refer to Definition 2.2 below for a precise definition of these quantities.

This approach to quantifying uncertainty can be extended to the case where A and B are quantum systems. The conclusion remains the same: the operationally relevant uncertainty interval is I as defined by (1). The only difference is that ρ is now a density operator, which describes the joint state of A and B [41,44,51]. Finding the boundaries of the interval I is a central task of information theory.

¹ There is some freedom in how to count the number of bits, but the statement always holds up to additive terms of the order log(1/ε).
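Definition 2.2 does not appear in this excerpt. As a point of reference, the smooth entropies are standardly defined as follows; the conventions here (in particular, smoothing over a purified-distance ball) follow Renner [45] and may differ in detail from the paper's Definition 2.2:

$$H_{\min}(A|B)_\rho = \max_{\sigma_B} \sup\,\{\lambda \in \mathbb{R} : \rho_{AB} \le 2^{-\lambda}\, \mathrm{id}_A \otimes \sigma_B\}, \qquad H_{\min}^{\varepsilon}(A|B)_\rho = \max_{\tilde{\rho} \in \mathcal{B}^{\varepsilon}(\rho)} H_{\min}(A|B)_{\tilde{\rho}},$$

$$H_{\max}(A|B)_\rho = \max_{\sigma_B} 2 \log F(\rho_{AB},\, \mathrm{id}_A \otimes \sigma_B), \qquad H_{\max}^{\varepsilon}(A|B)_\rho = \min_{\tilde{\rho} \in \mathcal{B}^{\varepsilon}(\rho)} H_{\max}(A|B)_{\tilde{\rho}},$$

where $\sigma_B$ ranges over density operators on B, $F$ denotes the fidelity, and $\mathcal{B}^{\varepsilon}(\rho)$ is the set of (subnormalised) states within purified distance ε of $\rho_{AB}$.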
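For the i.i.d. case invoked in the abstract, the Asymptotic Equipartition Property makes the first-order behaviour of the interval (1) explicit. The following standard statement, due to Tomamichel, Colbeck and Renner and included here for orientation rather than as part of the paper's text, says that for product states both boundaries converge to the von Neumann entropy rate:

$$\lim_{n \to \infty} \frac{1}{n}\, H_{\min}^{\varepsilon}(A^n|B^n)_{\rho^{\otimes n}} = \lim_{n \to \infty} \frac{1}{n}\, H_{\max}^{\varepsilon}(A^n|B^n)_{\rho^{\otimes n}} = H(A|B)_\rho \qquad \text{for all } \varepsilon \in (0,1),$$

so for $\rho^{\otimes n}$ both boundaries of I behave as $n\, H(A|B)_\rho$ to first order in n; the entropy accumulation result of this paper establishes the same first-order behaviour without the independence assumption.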