This is an overview page with metadata for this scientific article. The full article is available from the publisher.
AI-Enabled Clinical Decision Support Software: A “Trust and Value Checklist” for Clinicians
Citations: 20 · Authors: 3 · Year: 2020
Abstract
Machine learning and other forms of artificial intelligence (AI) are playing an increasing role in health care, particularly as an addition to human judgment in the form of clinical decision support (CDS). But as with all technologies, machine learning and AI will also have unintended consequences that could disrupt care and pose considerable risks for patients. It is vitally important that clinicians understand what is behind the recommendations that a CDS system offers, and that any such system adds real value and enables clinicians to perform more effectively and efficiently in serving the needs of patients. This article presents a "trust and value checklist" that is aimed not at senior health system leadership, but rather at the clinicians who will be using these systems. The checklist poses questions that clinicians should consider themselves, as well as questions they will want to make sure their leadership has addressed when making system selections. All of these questions should be considered, and answered to clinicians' satisfaction, before they start using and relying on CDS.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,292 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,143 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,539 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,452 citations