Post-processing posteriors over precision matrices to produce sparse graph estimates

Amir Bashir, Carlos M. Carvalho, P. Richard Hahn, M. Beatrix Jones

Research output: Contribution to journal › Article › peer-review


A variety of computationally efficient Bayesian models for the covariance matrix of a multivariate Gaussian distribution are available. However, all produce a relatively dense estimate of the precision matrix, and are therefore unsatisfactory when one wishes to use the precision matrix to consider the conditional independence structure of the data. This paper considers the posterior predictive distribution of model fit for these covariance models. We then undertake post-processing of the Bayes point estimate for the precision matrix to produce a sparse model whose expected fit lies within the upper 95% of the posterior predictive distribution of fit. The impact of the method for selecting the zero elements of the precision matrix is evaluated. Good results were obtained using models that encouraged a sparse posterior (G-Wishart, Bayesian adaptive graphical lasso) and selection using credible intervals. We also find that this approach is easily extended to the problem of finding a sparse set of elements that differ across a set of precision matrices, a natural summary when a common set of variables is observed under multiple conditions. We illustrate our findings with moderate dimensional data examples from finance and metabolomics.
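The selection step the abstract describes — take a Bayes point estimate of the precision matrix, zero out elements whose credible interval covers zero, and compare the sparse model's fit against the posterior predictive distribution of fit — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Wishart draws stand in for a real posterior (the paper uses G-Wishart and Bayesian adaptive graphical lasso posteriors), and the fit score is a simple Gaussian log-likelihood term.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_draws, df = 4, 500, 50

# Illustrative "true" tridiagonal precision matrix: variables are
# conditionally independent beyond their immediate neighbours.
true_prec = np.eye(p) + 0.3 * (np.eye(p, k=1) + np.eye(p, k=-1))

def wishart_draw(df, V, rng):
    """One Wishart(df, V) draw via a sum of Gaussian outer products."""
    L = np.linalg.cholesky(V)
    Z = rng.standard_normal((df, V.shape[0])) @ L.T  # rows ~ N(0, V)
    return Z.T @ Z

# Stand-in posterior draws, centred on true_prec (E[Wishart(df, V)] = df * V).
draws = np.stack([wishart_draw(df, true_prec / df, rng) for _ in range(n_draws)])
draws = 0.5 * (draws + np.transpose(draws, (0, 2, 1)))  # enforce exact symmetry

# Bayes point estimate: the posterior mean of the precision matrix (dense).
prec_hat = draws.mean(axis=0)

# Credible-interval selection: zero out off-diagonal entries whose
# elementwise 95% interval covers zero.
lo, hi = np.quantile(draws, [0.025, 0.975], axis=0)
mask = (lo <= 0) & (hi >= 0)
np.fill_diagonal(mask, False)
prec_sparse = np.where(mask, 0.0, prec_hat)

# Fit check (sketch): score(Omega) = log det(Omega) - tr(S @ Omega), compared
# against the distribution of the same score over the posterior draws.
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(true_prec), size=200)
S = X.T @ X / len(X)
score = lambda Om: np.linalg.slogdet(Om)[1] - np.trace(S @ Om)
score_draws = np.array([score(Om) for Om in draws])
ok = score(prec_sparse) >= np.quantile(score_draws, 0.05)
```

Here `prec_sparse` keeps only the entries the posterior clearly distinguishes from zero, and `ok` flags whether its expected fit lands within the upper 95% of the (stand-in) posterior distribution of fit.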

Original language: English (US)
Pages (from-to): 1075-1090
Number of pages: 16
Journal: Bayesian Analysis
Issue number: 4
State: Published - Dec 1 2019


Keywords

  • Covariance selection
  • Decoupling shrinkage and selection
  • Gaussian graphical models
  • Posterior summary
  • Shrinkage prior

ASJC Scopus subject areas

  • Statistics and Probability
  • Applied Mathematics

