Journal
ENTROPY
Volume 13, Issue 11, Pages 1945-1957
Publisher
MDPI
DOI: 10.3390/e13111945
Keywords
Shannon entropy; Tsallis entropy; information theory; measure-preserving function
Funding
- EU STREP QCS
- Engineering and Physical Sciences Research Council (EPSRC) [EP/D073537/1]
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the information loss, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
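The central quantity of the abstract, the information loss of a measure-preserving function, can be computed directly: push a finite probability measure forward along a function and take the difference of Shannon entropies. The following is a minimal illustrative sketch (not code from the paper; all names are ours), using natural logarithms:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i, skipping zero-probability outcomes."""
    return -sum(x * math.log(x) for x in p if x > 0)

def pushforward(p, f):
    """Push a finite probability measure p (dict: outcome -> probability)
    forward along f, summing probabilities of outcomes that f identifies."""
    q = {}
    for outcome, prob in p.items():
        image = f(outcome)
        q[image] = q.get(image, 0.0) + prob
    return q

def information_loss(p, f):
    """Information loss of f: H(p) - H(f_* p).
    As the abstract notes, this equals the conditional entropy of the
    random variable given its image under f, and is always >= 0."""
    q = pushforward(p, f)
    return shannon_entropy(p.values()) - shannon_entropy(q.values())

# Example: a fair 4-sided die collapsed by parity.
# H(p) = ln 4 and H(f_* p) = ln 2, so the loss is ln 2.
p = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
loss = information_loss(p, lambda k: k % 2)  # ln 4 - ln 2 = ln 2
```

A bijective relabeling of outcomes loses no information (loss 0), while collapsing distinct outcomes loses exactly the conditional entropy of the input given the output, matching the abstract's description of information loss as a special case of conditional entropy.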