Bridging Mixture Model Estimation and Information Bounds Using I-MMSE

Bryan Paul, Christian D. Chapman, Alex Rajan Chiriyath, Daniel Bliss

Research output: Contribution to journal › Article › peer-review



We derive bounds on mutual information for arbitrary estimation problems in additive noise, modeled using Gaussian mixtures. Previous work, which exploited the I-minimum-mean-squared-error (I-MMSE) formula to build a bridge between bounds on the MMSE for Gaussian mixture model estimation problems and bounds on the mutual information, is generalized to allow arbitrary noise modeling. A novel upper bound on estimation information is also developed for the general estimation case. In addition, limits are analyzed to develop bounds on arbitrary entropy, the asymptotic behavior of all bounds, and bound errors, with some results bridged back to the MMSE domain.
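The bridge referred to in the abstract is the I-MMSE relation of Guo, Shamai, and Verdú, which states that for a signal observed in Gaussian noise, the derivative of the mutual information with respect to SNR equals half the MMSE. The sketch below is not the paper's Gaussian-mixture construction; it simply checks the identity numerically in the classic scalar case with a Gaussian input, where both quantities have closed forms (I(snr) = ½ ln(1 + snr) nats, mmse(snr) = 1/(1 + snr)).

```python
import math

def mutual_info(snr):
    # Scalar Gaussian input in additive Gaussian noise:
    # I(snr) = 0.5 * ln(1 + snr), in nats.
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    # Closed-form MMSE for the same channel: mmse(snr) = 1 / (1 + snr).
    return 1.0 / (1.0 + snr)

def i_mmse_gap(snr, h=1e-6):
    # Central-difference estimate of dI/dsnr minus (1/2) * mmse(snr);
    # the I-MMSE formula says this gap should be ~0 at every SNR.
    dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
    return dI - 0.5 * mmse(snr)

for snr in (0.5, 1.0, 4.0):
    print(f"snr={snr}: gap={i_mmse_gap(snr):.2e}")
```

For Gaussian mixture inputs no such closed forms exist, which is what motivates the bounding approach the paper develops.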

Original language: English (US)
Article number: 7959624
Pages (from-to): 4821-4832
Number of pages: 12
Journal: IEEE Transactions on Signal Processing
Issue number: 18
State: Published - Sep 15 2017


Keywords

  • Gaussian mixture models
  • I-MMSE
  • MMSE
  • bounds
  • estimation information

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering


