OpenAlex · Updated hourly · Last updated: 13.05.2026, 00:32

This is an overview page with metadata about this scholarly work. The full article is available from the publisher.

Explainable AI for critical care: a systematic review of interpretable models for sepsis and ICU mortality prediction

2026 · 3 citations · BMC Medical Informatics and Decision Making · Open Access

Citations: 3
Authors: 2
Year: 2026

Abstract

Sepsis is a leading cause of mortality in intensive care units (ICUs), and its rapid progression poses significant challenges for early detection. Traditional scoring systems, such as SOFA and APACHE II, provide clinical benchmarks but often fail to capture subtle early signs of deterioration. Machine learning (ML) and deep learning (DL) models have demonstrated strong predictive performance; however, their “black-box” nature limits transparency, clinician trust, and adoption in real-world ICU settings. Explainable artificial intelligence (XAI) clarifies how predictions are derived. This systematic review examines studies published between 2020 and 2025 that applied XAI methods, including SHAP, LIME, Grad-CAM, and sensitivity analysis, to predict sepsis onset and ICU mortality. We analyzed the datasets used (e.g., MIMIC-III/IV, Emory University Hospital, Ruijin Hospital), model architectures, interpretability strategies, and the clinical features most strongly associated with predictions. Findings indicate that XAI-enhanced models not only maintain high predictive accuracy but also highlight clinically meaningful indicators such as respiratory rate, blood urea nitrogen (BUN), urine output, and the Glasgow Coma Scale (GCS), thereby improving clinician confidence and facilitating adoption. Despite these advances, challenges remain, including limited prospective evaluation, inconsistent interpretability metrics, and variable integration into clinical workflows. We conclude by recommending future research priorities, including real-world validation, user-centered design, and multimodal data integration, to ensure that XAI can reliably support timely and informed decision-making in critical care environments.
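Among the XAI methods the abstract names, sensitivity analysis is the simplest to illustrate: perturb one input feature at a time and observe how the predicted risk changes. The sketch below is purely illustrative and is not taken from any study in the review; the logistic "mortality risk" model, its weights, and the perturbation size are all assumptions, with the feature names (respiratory rate, BUN, urine output, GCS) drawn from the abstract.

```python
import numpy as np

# Hypothetical logistic "mortality risk" model over four ICU features.
# Feature names come from the abstract; the weights and bias are
# invented for illustration only.
FEATURES = ["respiratory_rate", "BUN", "urine_output", "GCS"]
WEIGHTS = np.array([0.04, 0.03, -0.002, -0.15])  # assumed weights
BIAS = -1.0

def risk(x):
    """Predicted mortality probability for a feature vector x."""
    return 1.0 / (1.0 + np.exp(-(x @ WEIGHTS + BIAS)))

def sensitivity(x, delta=1.0):
    """One-at-a-time sensitivity: change in predicted risk when each
    feature is increased by delta, holding the others fixed."""
    base = risk(x)
    return {name: risk(np.where(np.arange(len(x)) == i, x + delta, x)) - base
            for i, name in enumerate(FEATURES)}

# Example patient: resp. rate 22/min, BUN 30 mg/dL,
# urine output 400 mL/day, GCS 12.
patient = np.array([22.0, 30.0, 400.0, 12.0])
for name, change in sensitivity(patient).items():
    print(f"{name}: {change:+.4f}")
```

Under these assumed weights, a higher respiratory rate raises the predicted risk while a higher GCS lowers it, mirroring the direction of the clinical indicators the review highlights. Real studies apply the same idea to trained ML/DL models rather than a hand-written linear score.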


Topics

Sepsis Diagnosis and Treatment · Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education