Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 65, Issue 5, Pages 2899-2921
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2018.2883306
Keywords
Statistical property testing; sampling; entropy estimation; quantum information; query complexity
Funding
- NSF [CCF-1755800, CCF-1816695, CCF-1526380]
Abstract
Estimation of Shannon and Rényi entropies of unknown discrete distributions is a fundamental problem in statistical property testing. In this paper, we give the first quantum algorithms for estimating α-Rényi entropies (Shannon entropy being 1-Rényi entropy). In particular, we demonstrate a quadratic quantum speedup for Shannon entropy estimation and a generic quantum speedup for α-Rényi entropy estimation for all α ≥ 0, including tight bounds for the Shannon entropy, the Hartley entropy (α = 0), and the collision entropy (α = 2). We also provide quantum upper bounds for estimating the min-entropy (α = +∞) as well as the Kullback-Leibler divergence. We complement our results with quantum lower bounds on α-Rényi entropy estimation for all α ≥ 0. Our approach is inspired by the pioneering work of Bravyi, Harrow, and Hassidim (BHH), but with many new technical ingredients: 1) we improve the error dependence of the BHH framework by a fine-tuned error analysis together with Montanaro's approach to estimating the expected output of quantum subroutines for α = 0, 1; 2) we develop a procedure, similar to cooling schedules in simulated annealing, for general α ≥ 0; and 3) in the cases of integer α ≥ 2 and α = +∞, we reduce the entropy estimation problem to the α-distinctness and ⌈log n⌉-distinctness problems, respectively.
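For reference, the classical quantities being estimated can be sketched in a few lines. The snippet below (a minimal illustration, not part of the paper's quantum algorithms) computes the α-Rényi entropy H_α(p) = (1/(1−α)) log₂ Σᵢ pᵢ^α of a known discrete distribution, with the special cases named in the abstract: Shannon (α → 1 limit), Hartley (α = 0), collision (α = 2), and min-entropy (α = +∞).

```python
import math

def renyi_entropy(p, alpha):
    """Classical alpha-Renyi entropy (in bits) of a discrete distribution p.

    H_alpha(p) = (1 / (1 - alpha)) * log2(sum_i p_i^alpha) for alpha != 1;
    the alpha -> 1 limit recovers the Shannon entropy -sum_i p_i log2 p_i.
    """
    if alpha == 1:                     # Shannon entropy (alpha -> 1 limit)
        return -sum(x * math.log2(x) for x in p if x > 0)
    if alpha == float("inf"):          # min-entropy: -log2 of the largest mass
        return -math.log2(max(p))
    if alpha == 0:                     # Hartley entropy: log2 of the support size
        return math.log2(sum(1 for x in p if x > 0))
    return math.log2(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1))            # Shannon entropy: 1.5 bits
print(renyi_entropy(p, 2))            # collision entropy
print(renyi_entropy(p, 0))            # Hartley entropy: log2(3)
print(renyi_entropy(p, float("inf"))) # min-entropy: 1.0 bit
```

The paper's setting is harder: the distribution p is unknown and must be accessed through samples (or, quantumly, through a query oracle), which is what makes the query complexity of estimating these quantities nontrivial.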