Article

Empirical Estimation of Information Measures: A Literature Guide

Journal

ENTROPY
Volume 21, Issue 8

Publisher

MDPI
DOI: 10.3390/e21080720

Keywords

information measures; empirical estimators; entropy; relative entropy; mutual information; universal estimation

Funding

  1. US National Science Foundation [CCF-1016625]
  2. Center for Science of Information, an NSF Science and Technology Center [CCF-0939370]

We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information, and related information measures. While these quantities are of central importance in information theory, universal algorithms for their estimation have become increasingly important in data science, machine learning, biology, neuroscience, economics, linguistics, and other experimental sciences.
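As a minimal illustration of the kind of estimator surveyed here, the simplest approach to empirical entropy estimation is the plug-in (maximum-likelihood) estimator, which substitutes empirical frequencies into the Shannon entropy formula. The sketch below is illustrative and not from the paper; the function name is my own.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy, in bits.

    Substitutes the empirical frequency of each symbol for its true
    probability in H(P) = -sum_x P(x) log2 P(x). This estimator is
    consistent but negatively biased for finite sample sizes.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Fair-coin samples: the true entropy is 1 bit.
print(plugin_entropy(["H", "T"] * 500))  # → 1.0
```

Much of the literature the paper surveys concerns correcting the finite-sample bias of this estimator (e.g., the Miller–Madow correction) or replacing it with estimators that remain accurate when the alphabet is large relative to the sample size.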

Authors

Sergio Verdú
