This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
Explainable Clinical Decision Support Systems for Post-COVID Care Pathways
Citations: 0
Authors: 2
Year: 2021
Abstract
Clinical decision support systems have become a foundational component of modern healthcare delivery, particularly in contexts where patient trajectories are complex, uncertain, and long-running. Post-COVID care represents such a setting, characterised by heterogeneous symptoms, fluctuating recovery patterns, and multi-organ involvement. While machine-learning-driven decision support systems demonstrate strong predictive capability, their limited transparency poses barriers to clinical trust, accountability, and safe adoption. This article presents a comprehensive framework for explainable clinical decision support tailored to post-COVID care pathways. The proposed approach integrates interpretable predictive modelling, uncertainty-aware inference, and clinician-oriented explanation layers to support decision making across diagnosis, monitoring, and care planning. Through systematic architectural design and empirical evaluation, the study demonstrates how explainability can be embedded as a core system property rather than an afterthought, enabling reliable and actionable clinical insights in complex care environments.
Related Works
"Why Should I Trust You?"
2016 · 14,198 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,576 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,084 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,444 citations
Artificial intelligence in healthcare: past, present and future
2017 · 4,382 citations