Article

MMSE Bounds for Additive Noise Channels Under Kullback-Leibler Divergence Constraints on the Input Distribution

Journal

IEEE Transactions on Signal Processing
Volume 67, Issue 24, Pages 6352-6367

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2019.2951221

Keywords

Upper bound; Covariance matrices; Gaussian distribution; Estimation; Entropy; Mathematical model; Signal-to-noise ratio; MMSE bounds; robust estimation; minimax optimization; Cramér-Rao bound; Stam's inequality; entropy power inequality

Funding

  1. U.S. National Science Foundation [CCF-1513915]
  2. German Research Foundation (DFG) [424522268]


Upper and lower bounds on the minimum mean square error for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is defined implicitly via a system of non-linear equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound provides an alternative to well-known inequalities in estimation and information theory, such as the Cramér-Rao lower bound, Stam's inequality, or the entropy power inequality, that is potentially tighter and defined for a larger class of input distributions. Several examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
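
As a reading aid, the constrained problem described in the abstract can be stated compactly. The sketch below is not quoted from the paper; the symbols ε (divergence budget), P_0 (Gaussian reference distribution), and N (noise, independent of the input X) are notation assumed here.

\mathrm{mmse}(X \mid Y) \;=\; \mathbb{E}\!\left[ \lVert X - \mathbb{E}[X \mid Y] \rVert^{2} \right], \qquad Y = X + N,

\sup_{P_X \,:\, D(P_X \Vert P_0) \le \varepsilon} \mathrm{mmse}(X \mid Y)
\qquad \text{and} \qquad
\inf_{P_X \,:\, D(P_X \Vert P_0) \le \varepsilon} \mathrm{mmse}(X \mid Y).

Per the abstract, the supremum is attained by a Gaussian input with the same mean as P_0 and a covariance matrix determined implicitly by a system of non-linear equations, and the corresponding estimator is minimax robust within the divergence ball.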
