Bayesian Ensemble Learning

Hugh A. Chipman, Edward I. George, Robert E. McCulloch

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Scopus citations

Abstract

We develop a Bayesian “sum-of-trees” model, named BART, where each tree is constrained by a prior to be a weak learner. Fitting and inference are accomplished via an iterative backfitting MCMC algorithm. This model is motivated by ensemble methods in general, and boosting algorithms in particular. Like boosting, each weak learner (i.e., each weak tree) contributes a small amount to the overall model. However, our procedure is defined by a statistical model (a prior and a likelihood), whereas boosting is defined by an algorithm. This model-based approach enables a full and accurate assessment of uncertainty in model predictions, while remaining highly competitive in terms of predictive accuracy.
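To illustrate the additive structure the abstract describes, the sketch below fits a sum of weak trees (depth-1 stumps) by iterative backfitting: each tree is repeatedly refit to the residual left by all the other trees. This is a simplified illustration only; actual BART places priors on tree structure and leaf values and samples trees via MCMC rather than refitting them by least squares, and all names and settings here (number of trees, sweeps, the toy data) are our own choices.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residual r: choose the split
    on x minimizing squared error; return (split, left_mean, right_mean)."""
    best = None
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, left.mean(), right.mean())
    _, s, lm, rm = best
    return s, lm, rm

def predict_stump(params, x):
    s, lm, rm = params
    return np.where(x <= s, lm, rm)

# Toy data: noisy sine curve.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=200)

m = 50                        # number of weak trees in the ensemble
fits = np.zeros((m, len(y)))  # each tree's current additive contribution
params = [None] * m

# Backfitting sweeps: refit tree j to the residual left by the other m-1 trees.
# (BART's MCMC cycles through trees the same way, but draws each tree from its
# posterior instead of refitting it.)
for sweep in range(10):
    for j in range(m):
        residual = y - (fits.sum(axis=0) - fits[j])
        params[j] = fit_stump(x, residual)
        fits[j] = predict_stump(params[j], x)

prediction = fits.sum(axis=0)   # the sum-of-trees fit
rmse = float(np.sqrt(np.mean((prediction - y) ** 2)))
print(f"in-sample RMSE: {rmse:.3f}")
```

Each stump alone is a poor fit, but their sum tracks the sine curve closely; the weak-learner constraint in BART plays the analogous role of keeping any single tree's contribution small.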

Original language: English (US)
Title of host publication: NIPS 2006
Subtitle of host publication: Proceedings of the 19th International Conference on Neural Information Processing Systems
Editors: Bernhard Schölkopf, John C. Platt, Thomas Hofmann
Publisher: MIT Press Journals
Pages: 265-272
Number of pages: 8
ISBN (Electronic): 0262195682, 9780262195683
State: Published - 2006
Externally published: Yes
Event: 19th International Conference on Neural Information Processing Systems, NIPS 2006 - Vancouver, Canada
Duration: Dec 4, 2006 - Dec 7, 2006

Publication series

Name: NIPS 2006: Proceedings of the 19th International Conference on Neural Information Processing Systems

Conference

Conference: 19th International Conference on Neural Information Processing Systems, NIPS 2006
Country/Territory: Canada
City: Vancouver
Period: 12/4/06 - 12/7/06

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
