Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 68, Issue 3, Pages 1480-1481
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TIT.2021.3130189
Keywords
Strong data processing inequality (SDPI); contraction coefficient; f-divergence
Funding
- Israel Science Foundation (ISF) [1791/17]
- MIT-IBM Watson AI Laboratory
- National Science Foundation [ECCS-1808692, CCF-2131115]
Abstract
For any channel P_{Y|X}, the strong data processing constant is defined as the smallest number η_KL ∈ [0,1] such that I(U;Y) ≤ η_KL · I(U;X) holds for any Markov chain U − X − Y. It is shown that the value of η_KL is given by that of the best binary-input subchannel of P_{Y|X}. The same result holds for any f-divergence, verifying a conjecture of Cohen, Kemperman, and Zbaganu (1998).
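The abstract's main result suggests a simple numerical check: since η_KL is attained by a binary-input subchannel, restricting attention to pairs of channel rows and grid-searching over pairs of binary input distributions yields a lower bound on η_KL via the ratio D(Q_Y‖P_Y)/D(Q_X‖P_X). A minimal sketch, not from the paper (the function names, grid search, and BSC sanity check are illustrative assumptions):

```python
import itertools
import math

def kl(p, q):
    """KL divergence D(p||q) in nats for finite distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def eta_kl_lower_bound(rows, grid=200):
    """Lower-bound eta_KL(P_{Y|X}) by searching binary-input subchannels:
    fix two channel rows Q0, Q1, sweep over two binary input priors, and
    take the largest ratio D(Q_Y||P_Y) / D(Q_X||P_X)."""
    best = 0.0
    for Q0, Q1 in itertools.combinations(rows, 2):
        for i in range(1, grid):
            for j in range(1, grid):
                if i == j:
                    continue  # need Q_X != P_X so the denominator is positive
                a, b = i / grid, j / grid  # P_X = (a, 1-a), Q_X = (b, 1-b)
                Py = [a * y0 + (1 - a) * y1 for y0, y1 in zip(Q0, Q1)]
                Qy = [b * y0 + (1 - b) * y1 for y0, y1 in zip(Q0, Q1)]
                ratio = kl(Qy, Py) / kl([b, 1 - b], [a, 1 - a])
                best = max(best, ratio)
    return best

# Sanity check on BSC(0.1), where eta_KL = (1 - 2*0.1)**2 = 0.64 is known:
rows = [[0.9, 0.1], [0.1, 0.9]]
print(eta_kl_lower_bound(rows))  # close to, and never above, 0.64
```

The grid search only ever produces a valid lower bound, since every (binary-input subchannel, prior pair) choice is a feasible point in the supremum defining η_KL; the paper's theorem is what guarantees that this restricted search in fact approaches the true constant.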