This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
Do We Have Enough Information? Assessing AI Clinical Decision Support Systems for Implementation in Primary Care
Citations: 0
Authors: 6
Year: 2026
Abstract
This environmental scan examined commercially available AI clinical decision support solutions (AI-CDSS) across three domains: knowledge base, AI methodology, and privacy. Over half of vendors disclosed some information on their knowledge base, yet few demonstrated rigorous appraisal or alignment with Quality Standards or other evidence-based guidelines. Transparency on AI methods was limited, as most vendors cited proprietary algorithms but rarely described training data. Privacy information was more commonly reported but often high-level, with limited detail on compliance, storage location, or restrictions on secondary use. These gaps reveal the obstacles facing decision makers: without standardized, transparent information, organizations and governments cannot reliably evaluate AI-CDSS or provide the support clinicians need for responsible and informed implementation.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,402 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,270 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,702 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,507 citations