This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Explainable Knowledge Distillation for On-Device Chest X-Ray Classification
26 citations · 5 authors · published 2023
Abstract
Automated multi-label chest X-ray (CXR) image classification has achieved substantial progress in clinical diagnosis by utilizing sophisticated deep learning approaches. However, most deep models have high computational demands, which makes them less feasible for compact devices with limited computational resources. To overcome this problem, we propose a knowledge distillation (KD) strategy to create a compact deep learning model for real-time multi-label CXR image classification. We study different alternatives of CNNs and Transformers as the teacher to distill knowledge into a smaller student. We then employ explainable artificial intelligence (XAI) to provide visual explanations for the model decisions improved by KD. Our results on three benchmark CXR datasets show that our KD strategy improves the performance of the compact student model, making it a feasible choice for many hardware-limited platforms. For instance, when using DenseNet161 as the teacher network, EEEA-Net-C2 achieved an AUC of 83.7%, 87.1%, and 88.7% on the ChestX-ray14, CheXpert, and PadChest datasets, respectively, with only 4.7 million parameters and a computational cost of 0.3 billion FLOPs.
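The exact distillation objective is not reproduced on this overview page. As a hypothetical illustration only, a common formulation for multi-label distillation combines a soft-target term (student matching the teacher's temperature-softened per-label probabilities) with a hard-target term (binary cross-entropy against the ground-truth multi-hot labels); the function name, `temperature`, and `alpha` below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multilabel_kd_loss(student_logits, teacher_logits, labels,
                       temperature=2.0, alpha=0.7):
    """Sketch of a multi-label KD loss (hyperparameters are assumptions).

    student_logits, teacher_logits: arrays of shape (batch, num_labels)
    labels: multi-hot ground-truth array of the same shape
    """
    eps = 1e-12
    # Soft targets: soften both teacher and student per-label probabilities
    # with a temperature, then take the binary cross-entropy of the student
    # against the teacher's probabilities.
    p_t = sigmoid(teacher_logits / temperature)
    p_s = sigmoid(student_logits / temperature)
    soft = -(p_t * np.log(p_s + eps)
             + (1.0 - p_t) * np.log(1.0 - p_s + eps)).mean()
    # Hard targets: ordinary binary cross-entropy against the multi-hot labels.
    p = sigmoid(student_logits)
    hard = -(labels * np.log(p + eps)
             + (1.0 - labels) * np.log(1.0 - p + eps)).mean()
    # Weighted combination of the two terms.
    return alpha * soft + (1.0 - alpha) * hard
```

A student whose logits agree with the teacher and the labels incurs a lower loss than one that contradicts them, which is the signal the compact student is trained on.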
Related Works
La certeza de lo impredecible: Cultura Educación y Sociedad en tiempos de COVID19
2020 · 19,284 citations
A Multi-Modal Distributed Real-Time IoT System for Urban Traffic Control (Invited Paper)
2024 · 14,297 citations
UNet++: A Nested U-Net Architecture for Medical Image Segmentation
2018 · 8,772 citations
Review of deep learning: concepts, CNN architectures, challenges, applications, future directions
2021 · 7,387 citations
scikit-image: image processing in Python
2014 · 6,823 citations