OpenAlex · Updated hourly · Last updated: 14 March 2026, 16:23

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Explainable AI for Multi-Label Chest X-ray Diagnosis: Layer-wise Grad-CAM with Hierarchical Feature Extraction

2025 · 0 citations · 3 authors

Open full text at the publisher

Abstract

Artificial intelligence (AI) has become indispensable in medical image analysis, with models such as convolutional neural networks (CNNs) and Transformers achieving remarkable success in diagnostic imaging. Despite their impressive performance, these models often lack interpretability, limiting their adoption in clinical workflows where understanding disease-specific features is critical for trust.

In this study, we propose an explainability framework that enhances interpretability for multi-label disease classification in chest X-ray (CXR) diagnosis by utilizing the U-Net encoder-decoder architecture. The encoder and decoder outputs are concatenated to effectively capture hierarchical features for the classification of 14 observations in the MIMIC-CXR dataset. To further improve interpretability, we apply gradient-weighted class activation mapping (Grad-CAM) across multiple layers, providing detailed insight into how hierarchical features are refined and how disease-specific regions are emphasized throughout the network. This integration of U-Net with an explainable AI (XAI) framework enhances transparency in the diagnostic process, supporting more informed and trustworthy clinical decision making.

Clinical relevance: This study underscores the importance of interpretability in AI-based radiology. With clear Grad-CAM visualizations of disease-specific features, clinicians can more confidently validate model predictions and incorporate these insights into their decision making. Through enhanced transparency, our approach not only improves diagnostic performance but also fosters greater trust in AI tools, paving the way for these models to serve as robust, clinician-friendly decision support systems in routine radiological workflows.
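The abstract's core idea (concatenating pooled U-Net encoder and decoder features for 14-label classification, then applying Grad-CAM at several layers) can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation: the model `TinyUNetClassifier`, the helper `grad_cam`, the layer sizes, and the additive skip connection are all assumptions chosen for brevity.

```python
# Hedged sketch of layer-wise Grad-CAM on a toy U-Net-style classifier.
# All names and sizes here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUNetClassifier(nn.Module):
    """Minimal U-Net-style encoder-decoder; globally pooled encoder and
    decoder features are concatenated for 14-label classification."""
    def __init__(self, n_labels=14):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU())
        self.dec1 = nn.Sequential(nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU())
        self.head = nn.Linear(16 + 8, n_labels)  # pooled encoder + decoder features

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)       # coarse encoder features
        d1 = self.dec1(e2) + e1  # decoder features with skip connection
        pooled = torch.cat([e2.mean(dim=(2, 3)), d1.mean(dim=(2, 3))], dim=1)
        return torch.sigmoid(self.head(pooled))  # independent per-label scores

def grad_cam(model, layer, x, label_idx):
    """Grad-CAM heatmap for one label at one layer; calling it on several
    layers gives the layer-wise view described in the abstract."""
    feats, grads = [], []
    h1 = layer.register_forward_hook(lambda m, i, o: feats.append(o))
    h2 = layer.register_full_backward_hook(lambda m, gi, go: grads.append(go[0]))
    try:
        score = model(x)[0, label_idx]
        model.zero_grad()
        score.backward()
    finally:
        h1.remove()
        h2.remove()
    A, dA = feats[0], grads[0]
    weights = dA.mean(dim=(2, 3), keepdim=True)  # channel importance
    cam = F.relu((weights * A).sum(dim=1))       # weighted sum, then ReLU
    return cam[0] / (cam.max() + 1e-8)           # normalize to [0, 1]

model = TinyUNetClassifier()
x = torch.randn(1, 1, 32, 32)                          # stand-in for a CXR
cam_enc = grad_cam(model, model.enc2, x, label_idx=0)  # coarse encoder map
cam_dec = grad_cam(model, model.dec1, x, label_idx=0)  # finer decoder map
print(cam_enc.shape, cam_dec.shape)
```

Comparing `cam_enc` (16×16) against `cam_dec` (32×32) for the same label is the kind of layer-wise inspection the abstract describes: the encoder map shows coarse localization, the decoder map a spatially refined one.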

Related works

Authors

Institutions

Topics

COVID-19 diagnosis using AI · Explainable Artificial Intelligence (XAI) · Artificial Intelligence in Healthcare and Education