This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
Investigating the Role of AI Explanations in Lay Individuals’ Comprehension of Radiology Reports: A Metacognition Lense
Citations: 0
Authors: 3
Year: 2025
Abstract
Much research has focused on advancing techniques for explainable artificial intelligence (XAI) to improve the utility of AI recommendations. However, the metacognitive processes involved in interacting with AI explanations have not been fully explored. In this study, we examine the effects of AI explanations on human decisions from the perspective of the cognitive mechanisms that evaluate the correctness of AI recommendations. To this end, we conducted a large-scale, between-subjects experiment (N=4,302) on Amazon Mechanical Turk, in which each participant was asked to classify a radiology report as describing a normal or abnormal finding. Participants were randomly assigned to three groups: a) without accompanying AI input (control group), b) with AI prediction only, and c) with AI prediction and AI explanation. Our results show that AI explanations improved overall task performance. We hypothesize that explanations help decision-makers better evaluate their intuitions about their decisions—a process known as self-monitoring—and thereby overcome their cognitive limitations and compensate for machine prediction errors. Additionally, our results show that explanations are more effective when AI prediction confidence is high or users' self-confidence is low. We conclude the paper by discussing the theoretical and practical implications of our findings.
Related Works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,284 citations
Generative Adversarial Nets
2023 · 19,841 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,233 citations
"Why Should I Trust You?"
2016 · 14,179 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,096 citations