Article

Optimized quantum f-divergences and data processing

Publisher

IOP PUBLISHING LTD
DOI: 10.1088/1751-8121/aad5a1

Keywords

optimized f-divergence; data processing; sandwiched Rényi relative entropy; Petz-Rényi relative entropy

Funding

  1. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1714215]

Abstract

The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement measures can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz-Rényi relative entropies. In this paper, I introduce the optimized quantum f-divergence as a related generalization of quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Rényi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this paper is that there is now a single, unified approach for establishing the data processing inequality for both the Petz-Rényi and sandwiched Rényi relative entropies, for the full range of parameters for which it is known to hold. This paper discusses other aspects of the optimized f-divergence, such as the classical case, the classical-quantum case, and how to construct optimized f-information measures.
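
For context, the standard quantities named in the abstract can be written as follows; these definitions are standard background and are not reproduced from the indexed paper itself.

% Standard background definitions (not quoted from the paper): quantum relative
% entropy, its data-processing inequality under a quantum channel \mathcal{N}, and the
% Petz-Rényi and sandwiched Rényi relative entropies of order \alpha.
\[
  D(\rho \,\|\, \sigma) = \operatorname{Tr}\bigl[\rho (\log \rho - \log \sigma)\bigr],
  \qquad
  D(\rho \,\|\, \sigma) \;\geq\; D\bigl(\mathcal{N}(\rho) \,\|\, \mathcal{N}(\sigma)\bigr).
\]
% Petz-Rényi relative entropy; data processing is known to hold for \alpha \in [0,2].
\[
  D_{\alpha}(\rho \,\|\, \sigma) = \frac{1}{\alpha - 1}
    \log \operatorname{Tr}\bigl[\rho^{\alpha} \sigma^{1-\alpha}\bigr],
\]
% Sandwiched Rényi relative entropy; data processing is known to hold for \alpha \in [1/2,\infty).
\[
  \widetilde{D}_{\alpha}(\rho \,\|\, \sigma) = \frac{1}{\alpha - 1}
    \log \operatorname{Tr}\Bigl[\bigl(\sigma^{(1-\alpha)/(2\alpha)}\, \rho\, \sigma^{(1-\alpha)/(2\alpha)}\bigr)^{\alpha}\Bigr].
\]

Both Rényi families recover the quantum relative entropy in the limit \alpha \to 1, which is why a single data-processing proof covering both, as described in the abstract, is of interest.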

