Article

A Characterization of Entropy in Terms of Information Loss

Journal

ENTROPY
Volume 13, Issue 11, Pages 1945-1957

Publisher

MDPI
DOI: 10.3390/e13111945

Keywords

Shannon entropy; Tsallis entropy; information theory; measure-preserving function

Funding

  1. EU STREP QCS
  2. Engineering and Physical Sciences Research Council (EPSRC) [EP/D073537/1]

Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the information loss, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
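The central quantity in the abstract, the information loss of a measure-preserving function, is the drop in Shannon entropy from a distribution to its pushforward. The sketch below illustrates this on finite probability spaces; the function names and the example map are illustrative choices, not code from the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def pushforward(p, f):
    """Pushforward of the distribution p along a map f on outcome indices."""
    q = {}
    for i, pi in enumerate(p):
        q[f(i)] = q.get(f(i), 0.0) + pi
    return list(q.values())

def information_loss(p, f):
    """Entropy lost along the measure-preserving map f: H(p) - H(f_* p)."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f))

# Merging four equally likely outcomes pairwise loses log 2 nats:
# H(1/4,1/4,1/4,1/4) - H(1/2,1/2) = log 4 - log 2 = log 2.
loss = information_loss([0.25] * 4, lambda i: i // 2)
print(loss)  # ≈ 0.6931 (= log 2)
```

An injective map loses no information (its pushforward has the same entropy), which is one face of the functoriality property the characterization rests on.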

Authors

John C. Baez, Tobias Fritz, Tom Leinster
