The increased availability of a wide range of sensing technologies over the last few decades has resulted in a correspondingly increased need for reliable information fusion methods in machine learning applications. While existing theories such as Dempster-Shafer theory and possibility theory have been used for several years now, they do not provide guarantees of error calibration in information fusion settings. The Conformal Predictions (CP) framework is a new game-theoretic approach to reliable machine learning, which provides a methodology to obtain error calibration in classification and regression settings. In this work, we present a methodology to extend the Conformal Predictions framework to both classification-based and regression-based information fusion settings. This methodology applies the CP framework to each data source as an independent hypothesis test, and subsequently uses p-value combination methods as a test statistic for the combined hypothesis after fusion. The proposed methodology was studied in classification and regression settings within two real-world application contexts: person recognition using multiple modalities (classification), and head pose estimation using multiple image features (regression). Our experimental results showed that quantile methods of combining p-values (such as the Standard Normal Function and the Non-conformity Aggregation methods) provided the most statistically valid calibration results, and can be considered suitable for extending the CP framework to information fusion settings.
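The fusion recipe described above can be sketched in a few lines: compute a conformal p-value per data source, then combine them with a quantile rule. This is a minimal illustration under assumed inputs, not the paper's implementation — the nonconformity scores, function names, and the clamping constant are illustrative, and the Standard Normal Function combiner shown is the classical Stouffer rule.

```python
from statistics import NormalDist


def conformal_p_value(calibration_scores, test_score):
    """Conformal p-value: fraction of calibration nonconformity
    scores at least as large as the test example's score."""
    n_ge = sum(1 for s in calibration_scores if s >= test_score)
    return (n_ge + 1) / (len(calibration_scores) + 1)


def standard_normal_combine(p_values):
    """Combine per-source p-values with the Standard Normal Function
    (Stouffer) rule: Z = sum_i Phi^{-1}(1 - p_i) / sqrt(k)."""
    nd = NormalDist()
    eps = 1e-12  # clamp away from 0/1 so inv_cdf stays defined
    z = sum(nd.inv_cdf(1 - min(max(p, eps), 1 - eps)) for p in p_values)
    z /= len(p_values) ** 0.5
    return 1 - nd.cdf(z)


# Hypothetical two-modality example: one p-value per sensing modality
# for a candidate class label, then one combined p-value for the
# fused hypothesis test.
face_p = conformal_p_value([0.2, 0.5, 0.9, 1.4], 0.8)  # -> 3/5
speech_p = conformal_p_value([0.1, 0.3, 0.6], 0.2)     # -> 3/4
fused_p = standard_normal_combine([face_p, speech_p])
```

Because each per-source p-value is valid on its own, the quantile combination preserves a well-behaved null distribution, which is why such rules tend to keep the fused test calibrated.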

Original language: English (US)
Pages (from-to): 45-65
Number of pages: 21
Journal: Annals of Mathematics and Artificial Intelligence
Issue number: 1-2
State: Published - Jun 8 2015


Keywords

  • Conformal predictors
  • Face processing applications
  • Information fusion
  • Multiple hypothesis testing

ASJC Scopus subject areas

  • Artificial Intelligence
  • Applied Mathematics


Title: Conformal predictions for information fusion: A comparative study of p-value combination methods
