Estimation information bounds using the I-MMSE formula and Gaussian mixture models

Bryan Paul, Daniel Bliss

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations

Abstract

We derive a method to bound the mutual information between a noisy and a noiseless measurement by exploiting the I-MMSE connection between estimation and information theory. Modeling the source distribution as a Gaussian mixture model, we obtain closed-form upper and lower bounds on the minimum mean square error using recent results. Using the connection between the rate of change of information with respect to SNR and the minimum mean square error of the estimator, the mutual information can likewise be bounded for arbitrary source distributions in Gaussian noise.
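The core tool named in the abstract is the I-MMSE relation, dI(snr)/dsnr = (1/2) mmse(snr), which turns any bound on the minimum mean square error into a bound on the mutual information by integrating over SNR. The sketch below is illustrative only: it assumes a hypothetical scalar two-component Gaussian mixture source in unit-variance Gaussian noise, and it substitutes two standard MMSE surrogates (the linear MMSE as an upper bound and a genie-aided per-component MMSE as a lower bound) for the paper's specific closed-form bounds, which are not reproduced here. A numerically evaluated MMSE is integrated as a reference point between the two bounds.

```python
# Illustrative sketch (not the paper's exact bounds): bound I(X; Y) for
# Y = sqrt(snr)*X + N, N ~ N(0, 1), with X a scalar Gaussian mixture, via the
# I-MMSE relation dI/dsnr = (1/2)*mmse(snr).  The MMSE is bracketed by two
# generic surrogates -- the linear MMSE (upper bound) and the genie-aided
# per-component MMSE (lower bound) -- and each is integrated over SNR to
# bound the mutual information (in nats).
import numpy as np

# Hypothetical two-component Gaussian mixture source: weights, means, variances.
w = np.array([0.3, 0.7])
mu = np.array([-1.0, 2.0])
var = np.array([0.5, 1.5])

def trapz(f, x):
    """Simple trapezoidal rule (avoids NumPy version differences)."""
    f, x = np.asarray(f, dtype=float), np.asarray(x, dtype=float)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

def mmse_exact(snr, n_grid=4001):
    """Numerically evaluate mmse(snr) = E[Var(X | Y)] for the GMM source."""
    if snr == 0.0:
        m = np.sum(w * mu)
        return float(np.sum(w * (var + mu**2)) - m**2)
    s = np.sqrt(snr)
    # Per-component statistics of Y = sqrt(snr)*X + N.
    y_mu = s * mu
    y_var = snr * var + 1.0
    y = np.linspace(y_mu.min() - 8 * np.sqrt(y_var.max()),
                    y_mu.max() + 8 * np.sqrt(y_var.max()), n_grid)
    # Weighted component likelihoods and responsibilities p(k | y).
    lik = w[:, None] * np.exp(-(y - y_mu[:, None])**2 / (2 * y_var[:, None])) \
          / np.sqrt(2 * np.pi * y_var[:, None])
    p_y = lik.sum(axis=0)
    resp = lik / p_y
    # Per-component posterior mean and variance of X given y (Gaussian algebra).
    gain = s * var / y_var                 # Wiener gain per component
    m_k = mu[:, None] + gain[:, None] * (y - y_mu[:, None])
    v_k = var / y_var                      # posterior variance per component
    post_mean = np.sum(resp * m_k, axis=0)
    post_2nd = np.sum(resp * (v_k[:, None] + m_k**2), axis=0)
    post_var = post_2nd - post_mean**2
    return trapz(post_var * p_y, y)        # average conditional variance

def mi_bounds(snr):
    """Closed-form MI bounds from integrating the two MMSE surrogates (nats)."""
    var_x = np.sum(w * (var + mu**2)) - np.sum(w * mu)**2
    upper = 0.5 * np.log(1.0 + snr * var_x)             # from LMMSE upper bound
    lower = 0.5 * np.sum(w * np.log(1.0 + snr * var))   # from genie lower bound
    return lower, upper

snr = 3.0
grid = np.linspace(0.0, snr, 121)
mi_num = 0.5 * trapz([mmse_exact(g) for g in grid], grid)   # I-MMSE integral
lo, hi = mi_bounds(snr)
print(f"I(snr={snr}): lower {lo:.4f} <= numerical {mi_num:.4f} <= upper {hi:.4f}")
```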

Original language: English (US)
Title of host publication: 2016 50th Annual Conference on Information Systems and Sciences, CISS 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 274-279
Number of pages: 6
ISBN (Electronic): 9781467394574
DOIs
State: Published - Apr 26 2016
Event: 50th Annual Conference on Information Systems and Sciences, CISS 2016 - Princeton, United States
Duration: Mar 16 2016 - Mar 18 2016

Other

Other: 50th Annual Conference on Information Systems and Sciences, CISS 2016
Country/Territory: United States
City: Princeton
Period: 3/16/16 - 3/18/16

Keywords

  • Bounds
  • Estimation Information
  • Gaussian Mixture Models
  • I-MMSE

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
