OpenAlex · Updated hourly · Last updated: 08.05.2026, 04:05

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

Enhancing AI Clinical Decision Support Trust: Design Workshop Insights from General Practitioners

2025 · 3 citations · Studies in Health Technology and Informatics · Open Access
Open full text at publisher

3 citations · 2 authors · 2025

Abstract

Artificial Intelligence (AI) predictive models are increasingly integrated into Clinical Decision Support Systems (CDSS). However, real-world implementation is lagging due to a lack of trust and acceptance by healthcare providers. To investigate factors related to trust and acceptance of AI-based CDSS, we conducted a workshop with General Practitioners to explore user, model, and organizational factors that could affect trust and recommendation acceptance. The workshop discussions revealed that while explainability is crucial for understanding AI decisions, current explainable AI (XAI) visualizations appeared to provide limited value to clinicians without technical backgrounds. Participants suggested that transparency in model training information, including data sources and elements, combined with clear alignment between model decisions and established clinical research findings and guidelines, would enhance their trust and acceptance of AI-based recommendations. These insights provide valuable direction for developing user interfaces that enhance clinician trust and acceptance of AI-based CDSS in clinical practice.


Topics

Explainable Artificial Intelligence (XAI) · Artificial Intelligence in Healthcare and Education · Machine Learning in Healthcare