This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Learning to Think with AI: A Survey on Health Professional Students’ Use of Generative AI During Clinical Placements (Preprint)
Citations: 0
Authors: 5
Year: 2025
Abstract
<sec> <title>BACKGROUND</title> Generative artificial intelligence (GenAI) has rapidly expanded in higher education and clinical practice. Large language models such as ChatGPT are widely adopted by health profession students for learning and writing tasks. However, little is known about how these tools are mobilized during clinical placements, a critical stage of training where students face high cognitive demands and increasing responsibility for patient care. </sec> <sec> <title>OBJECTIVE</title> This study aimed to map self-reported uses of GenAI during clinical placements, assess perceived benefits and risks, and identify training and governance needs. </sec> <sec> <title>METHODS</title> We conducted a cross-sectional online survey at Université Grenoble Alpes (France) from June to September 2025. Eligible participants were students in medicine, pharmacy, nursing, midwifery, or physiotherapy who were currently in, or had completed within the past 18 months, a clinical placement. The 61-item questionnaire included closed and open-ended questions. A composite maturity score classified respondents as Minimal, Limited, Moderate, or High. Descriptive statistics and trend tests were used for analysis. </sec> <sec> <title>RESULTS</title> Among 388 respondents (79% female; 56% nursing, 18% medicine, 17% pharmacy, 6% midwifery, 3% physiotherapy), 53% reported using GenAI during placements. Uptake was lowest in midwifery (26%) and rose markedly with maturity (9% Minimal vs 76% High; P<.001). Students mainly used GenAI for information retrieval (78%), bibliographic search (75%), and translation/rephrasing (71%). Clinical-facing tasks such as case simulation (55%), drafting patient documents (38%), or preparing patient communication (38%) were less frequent, with fewer than 15% reporting weekly use. Most students avoided entering patient identifiers, but 23% acknowledged at least one disclosure, and 47% reported sharing anonymized medical data. 
Benefits were most often perceived for documentation support (81%) and information access (69%). Risks included dependence (91%), erosion of skills (85%), and confidentiality breaches (87%). Students highlighted strong needs for ethics/regulation training (78%), best-practice guidance (78%), profession-specific coaching (74%), and human–AI collaboration (73%). </sec> <sec> <title>CONCLUSIONS</title> GenAI is already embedded in the daily practices of health profession students during placements, primarily as a tool for documentation and information management. While students recognize its utility, they also express concerns about dependence, skills, and confidentiality. These findings underscore the urgent need for structured curricula and governance frameworks to support responsible and patient-centered integration of GenAI into clinical education. </sec>
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,357 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,221 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,640 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,482 citations