This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
AI-Supported Documentation and Clinical Monitoring in Palliative Care: A Real-World Observational Study
Citations: 4
Authors: 2
Year: 2025
Abstract
Background: Patients in palliative care often experience prolonged hospital stays, requiring detailed documentation, complex symptom management, and multidisciplinary coordination. The emergence of AI tools, particularly GPT-based language models, offers new opportunities to support clinical workflows.

Objective: To evaluate the practical utility of a GPT-based AI tool in supporting documentation, trend recognition, and clinical decision support in a real-world palliative care unit.

Methods: This retrospective observational study included 25 patients admitted to a hospital-based palliative care unit during April 2025. The AI tool was used to assist with clinical documentation (daily notes and discharge summaries), drug monitoring, and recognition of clinical trends. Physicians entered patient data through a text-based interface, and the AI generated draft documentation, which was then reviewed and finalized by attending clinicians. All therapeutic suggestions were verified through specialist consultation.

Results: AI assistance significantly reduced documentation time for discharge summaries, from an average of 20.4 ± 5.6 min to 6.1 ± 1.8 min. In eight patients, the AI flagged important trends, such as rising CRP levels, prompting earlier re-evaluation. The tool also provided medication suggestions in six cases, all confirmed by internal medicine specialists. Physicians reported reduced cognitive load and improved clarity in clinical records. Communication with families was enhanced using AI-generated educational summaries.

Conclusion: GPT-based AI tools can improve documentation efficiency and clinical awareness in palliative care units. While not a replacement for physician judgment, they provide valuable support in managing complex, long-term patients when used under appropriate clinical supervision.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,250 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,109 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,482 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,434 citations