This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Exploring and Promoting Diagnostic Transparency and Explainability in Online Symptom Checkers
Citations: 74
Authors: 5
Year: 2021
Abstract
Online symptom checkers (OSC) are widely used intelligent systems in health contexts such as primary care, remote healthcare, and epidemic control. OSCs use algorithms such as machine learning to facilitate self-diagnosis and triage based on symptoms input by healthcare consumers. However, intelligent systems’ lack of transparency and comprehensibility could lead to unintended consequences such as misleading users, especially in high-stakes areas such as healthcare. In this paper, we attempt to enhance diagnostic transparency by augmenting OSCs with explanations. We first conducted an interview study (N=25) to specify user needs for explanations from users of existing OSCs. Then, we designed a COVID-19 OSC that was enhanced with three types of explanations. Our lab-controlled user study (N=20) found that explanations can significantly improve user experience in multiple aspects. We discuss how explanations are interwoven into conversation flow and present implications for future OSC designs.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,316 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,177 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,575 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,468 citations