OpenAlex · Updated hourly · Last updated: 06.04.2026, 22:57

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

COVID-Net Clinical ICU: Enhanced Prediction of ICU Admission for COVID-19 Patients via Explainability and Trust Quantification

2021 · 2 citations · arXiv (Cornell University) · Open Access
Open full text at publisher

2 citations · 4 authors · Year: 2021

Abstract

The COVID-19 pandemic continues to have a devastating global impact, and has placed a tremendous burden on struggling healthcare systems around the world. Given the limited resources, accurate patient triaging and care planning is critical in the fight against COVID-19, and one crucial task within care planning is determining if a patient should be admitted to a hospital's intensive care unit (ICU). Motivated by the need for transparent and trustworthy ICU admission clinical decision support, we introduce COVID-Net Clinical ICU, a neural network for ICU admission prediction based on patient clinical data. Driven by a transparent, trust-centric methodology, the proposed COVID-Net Clinical ICU was built using a clinical dataset from Hospital Sirio-Libanes comprising 1,925 COVID-19 patient records, and is able to predict when a COVID-19 positive patient would require ICU admission with an accuracy of 96.9% to facilitate better care planning for hospitals amidst the ongoing pandemic. We conducted system-level insight discovery using a quantitative explainability strategy to study the decision-making impact of different clinical features and gain actionable insights for enhancing predictive performance. We further leveraged a suite of trust quantification metrics to gain deeper insights into the trustworthiness of COVID-Net Clinical ICU. By digging deeper into when and why clinical predictive models make certain decisions, we can uncover key factors in decision making for critical clinical decision support tasks such as ICU admission prediction and identify the situations under which clinical predictive models can be trusted for greater accountability.
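The abstract describes a binary classifier over tabular clinical features plus a quantitative explainability analysis of per-feature impact. The paper's actual network architecture and explainability method are not given on this page, so the sketch below is only a minimal stand-in: a logistic-regression classifier on synthetic "clinical" features, with permutation importance as a crude analogue of feature-impact analysis. The data, weights, and feature count are all assumptions for illustration, not the Hospital Sirio-Libanes dataset or the COVID-Net Clinical ICU model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for tabular clinical data (assumption: 5 features,
# binary ICU-admission label driven mostly by the first two features).
n, d = 400, 5
X = rng.normal(size=(n, d))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
y = (X @ true_w + rng.normal(scale=0.5, size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train a logistic-regression classifier with plain gradient descent.
w = np.zeros(d)
for _ in range(2000):
    p = sigmoid(X @ w)
    w -= 0.1 * (X.T @ (p - y)) / n

preds = (sigmoid(X @ w) > 0.5).astype(float)
accuracy = (preds == y).mean()

# Permutation importance: how much accuracy drops when one feature's
# column is shuffled -- a simple proxy for the decision-making impact
# of each clinical feature.
importance = []
for j in range(d):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    acc_j = ((sigmoid(Xp @ w) > 0.5).astype(float) == y).mean()
    importance.append(accuracy - acc_j)

print(f"accuracy = {accuracy:.3f}")
print("feature importance:", np.round(importance, 3))
```

On this toy setup, the informative features (large true weights) show a clear accuracy drop when permuted, while the uninformative ones hover near zero, mirroring the kind of system-level insight the abstract attributes to the explainability analysis.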

Similar works

Authors

Topics

Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education · COVID-19 diagnosis using AI