This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Key information influencing patient decision-making about AI in healthcare – A survey experiment
Citations: 0
Authors: 8
Year: 2025
Abstract
Background: Artificial Intelligence (AI)-enabled devices are increasingly used in healthcare. However, there has been limited research on patients' informational preferences, including which elements of AI device labeling enhance patient understanding, trust, and acceptance. Clear and effective patient-facing communication is essential to address patient concerns and support informed decision-making regarding AI-enabled care.

Objective: Using simulated AI device labels in a cardiovascular context, we evaluated three aims. First, we identified key information elements that influence patient trust and acceptance of an AI device. Second, we examined how these effects varied based on patient characteristics. Third, we explored how patients evaluated the informational content of AI labels and the labels' perceived effectiveness in informing decision-making about use of the AI device, building trust in the device, and shaping their intention to use it in their healthcare.

Methods: We recruited 340 US patients from ResearchMatch.org to participate in a web-based survey that contained two experiments. In the discrete choice experiment (DCE), participants indicated preferences, in terms of trust and acceptance, regarding 16 pairs of simulated AI device labels that varied across eight types of information needs identified in our previous qualitative work. In the single profile factorial experiment (SPFE), participants evaluated four randomly assigned label prototypes regarding the label's legibility, comprehensibility, information overload, credibility, and perceived effectiveness in informing about the AI device, as well as participants' trust in the AI device and intention to use the device in their healthcare. Data were analyzed using mixed-effects binary or ordinal logistic regression.
Results: The DCE showed that information about regulatory approval, high device performance, provider oversight, and AI's value added to usual care significantly increased the likelihood of patient trust by 14.1-19.3% and acceptance by 13.3-17.9%. Subgroup analyses revealed variations based on patient characteristics such as familiarity with AI, health literacy, and recency of last medical checkup. The SPFE showed that patients reported good label comprehension, and that information about provider oversight, regulatory approval, device performance, and AI's added value improved the perceived credibility and effectiveness of the AI label (odds ratios [ORs] range 1.35-2.05), reduced doubts about the AI device (ORs range 0.61-0.77), and increased trust in and intention to use the AI device (ORs range 1.47-1.73). However, information about data privacy and safety management protocols was less influential.

Conclusion: Patients value information about an AI device's performance, provider oversight, regulatory status, and added value during decision-making. Providing transparent, easily understandable information about these aspects is critical to support patient determinations of trust and acceptance of AI-enabled healthcare. The impact of individual information elements on patient trust and acceptance varies by patient characteristics, highlighting the need for a tailored approach to address the concerns of diverse patient groups about AI in healthcare.
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,245 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,100 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,466 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,429 citations