This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Generative Artificial Intelligence for Clinical Communication: Implications for Non-Pharmacological Interventions in Health Care
Citations: 2
Authors: 5
Year: 2023
Abstract
Objectives: The objective of this study was to evaluate the potential of generative artificial intelligence (AI) to facilitate clinical communication, particularly in addressing sexual health concerns, which are often challenging for patients to discuss.

Methods: We employed the Generative Pre-trained Transformer-3.5 (GPT) as the generative AI platform and utilized DocsBot for citation retrieval (June 2023). A structured prompt was devised to have the AI generate 100 questions based on epidemiological survey data regarding sexual difficulties among cancer survivors. These questions were submitted to Bot1 (standard GPT) and Bot2 (sourced from two clinical guidelines). The responses from both bots were compared to assess consistency and adherence to clinical guidelines.

Results: Our analysis revealed no censorship of sexual expressions or medical terms. The most common themes among the generated questions were cancer treatment, sexual health, and advice. The similarity rate between responses from Bot1 and Bot2 averaged 92.5% (range 77.0% to 98.4%), with notably lower similarity for items not covered in the guidelines. Although guideline recommendations were not reflected, counseling and other non-pharmacological interventions were significantly more prevalent than drug interventions in both bots' responses, with odds ratios of 4.8 (p=0.04) for Bot1 and 14.9 (p<0.001) for Bot2.

Discussion: Generative AI can serve to provide health information on sensitive topics such as sexual health, despite the potential for policy-restricted content. Responses were significantly skewed toward non-pharmacological interventions, possibly due to the prohibitive nature of medical topics. This shift warrants attention, as it could raise patients' expectations for non-pharmacological interventions.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,460 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,341 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,791 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,536 citations