Evaluating the Impact of Uncertainty Visualization on Model Reliance

Research output: Contribution to journal › Article › peer-review

1 Scopus citations


Machine learning models have gained traction as decision-support tools for tasks that require processing large amounts of data. However, to realize the benefits of automating this part of decision-making, people must be able to trust the machine learning model's outputs. To enhance trust and promote appropriate reliance on the model, visualization techniques such as interactive model steering, performance analysis, model comparison, and uncertainty visualization have been proposed. In this study, we tested the effects of two uncertainty visualization techniques in a college admissions forecasting task, under two task difficulty levels, using Amazon's Mechanical Turk platform. Results show that (1) people's reliance on the model depends on the task difficulty and the level of machine uncertainty, and (2) ordinal forms of expressing model uncertainty are more likely to calibrate model usage behavior. These outcomes emphasize that reliance on decision-support tools can depend on the cognitive accessibility of the visualization technique and on perceptions of model performance and task difficulty.
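To illustrate what an "ordinal" expression of model uncertainty might look like (this is only a sketch, not the paper's implementation; the function name, thresholds, and labels are hypothetical), one common approach is to bin a model's predicted probability into discrete confidence labels rather than display a raw number or a continuous interval:

```python
# Illustrative sketch only: bin a predicted probability into ordinal
# confidence labels. Thresholds and labels are arbitrary choices for
# this example, not values from the study.

def ordinal_confidence(p_admit: float) -> str:
    """Map a predicted admission probability to an ordinal uncertainty label."""
    # Distance from 0.5, rescaled: 0 = maximally uncertain, 1 = fully certain.
    certainty = abs(p_admit - 0.5) * 2
    if certainty < 0.2:
        return "very uncertain"
    elif certainty < 0.6:
        return "somewhat uncertain"
    else:
        return "fairly certain"

for p in (0.52, 0.75, 0.97):
    print(f"P(admit)={p:.2f} -> model is {ordinal_confidence(p)}")
```

The design intuition, consistent with the study's finding, is that coarse categories like these can be more cognitively accessible than continuous encodings, which may help people calibrate when to rely on the model.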

Original language: English (US)
Pages (from-to): 1-15
Number of pages: 15
Journal: IEEE Transactions on Visualization and Computer Graphics
State: Accepted/In press - 2023


Keywords

  • Computational modeling
  • Data models
  • Data visualization
  • Human-machine collaborations
  • Model reliance
  • Prediction algorithms
  • Predictive models
  • Task analysis
  • Trust
  • Uncertainty

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design


