This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Hybrid intelligence decision making: successful human-AI integration in optical diagnosis
Citations: 1 · Authors: 6 · Year: 2022
Abstract
Artificial Intelligence (AI) systems are a valuable support for decision-making, with many applications in the medical domain. However, there is little understanding of how human experts interact with AI, and health policy-makers fear blind reliance on AI advice. In this multicenter study, twenty-one endoscopists reviewed 504 videos of lesions from real colonoscopies, with and without the assistance of an AI support system. Endoscopists were influenced by the AI (OR = 3.05), but not erratically: they followed the AI advice more often when it was correct (OR = 3.48) than when it was incorrect (OR = 1.85). Endoscopists achieved this outcome through a weighted integration of their own and the AI's opinions, based on case-by-case estimates of the two reliabilities. This Bayesian-like rational behavior allowed the human-AI hybrid team to outperform either agent alone. We discuss the features of the interaction that determined this favorable outcome.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,339 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,211 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,614 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,478 citations