This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Optimizing Deep Neural Networks via Knowledge Distillation for Efficient and Sustainable Medical Image Classification
Citations: 0
Authors: 4
Year: 2026
Abstract
Deep learning has revolutionized computational medicine, yet its deployment remains limited by high demands for computational resources, memory, and energy. The growing complexity of AI models, which comprise hundreds of millions of parameters and depend on high-performance GPU clusters, hinders their implementation in resource-constrained environments where latency and data privacy are critical. This study aims to develop and validate a methodology that optimizes and compresses deep neural networks without compromising predictive performance, enabling their efficient and sustainable deployment on portable devices, edge systems, and low-power medical equipment. Knowledge Distillation (KD) transfers knowledge from a large, robust "teacher" model to a smaller, more efficient "student" model. In this study, DenseNet-121 served as the teacher and MobileNet-V2 as the student. Using the Ones strategy, which yielded the best results, the distilled MobileNet-V2 achieved an AUC-ROC of 0.865 (standard deviation 0.0010), outperforming both the teacher (AUC-ROC 0.858, standard deviation 0.0015) and the undistilled student (AUC-ROC 0.824, standard deviation 0.0006). The teacher model contained 6.96 M parameters (26.87 MB), while the student used 2.23 M parameters (8.64 MB), a roughly threefold reduction in size. This compact model delivered comparable or superior accuracy with lower inference time and energy consumption, aligning with sustainability goals and enabling the deployment of robust medical AI in resource-limited environments. Code is available at: https://github.com/felixmejia/Knowledge_Distillation.
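The teacher-student transfer described in the abstract is typically trained with a weighted combination of a soft-target loss (matching the teacher's temperature-softened output distribution) and the standard hard-label cross-entropy. The sketch below illustrates that loss in plain NumPy; the temperature `T`, weight `alpha`, and function names are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy.

    T and alpha are illustrative hyperparameters, not the paper's settings.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) on softened outputs, scaled by T^2 so its
    # gradient magnitude stays comparable across temperatures.
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    soft_loss = (T ** 2) * kl.mean()
    # Standard cross-entropy against the ground-truth labels (T = 1).
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_loss + (1.0 - alpha) * ce
```

In practice the teacher (here DenseNet-121) is frozen while the student (MobileNet-V2) is optimized on this combined loss; when the student's logits already match the teacher's, the soft-target term vanishes and only the hard-label cross-entropy remains.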
Similar Works
Deep Residual Learning for Image Recognition
2016 · 218,745 citations
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 87,249 citations
ImageNet classification with deep convolutional neural networks
2017 · 75,672 citations
Very Deep Convolutional Networks for Large-Scale Image Recognition
2014 · 75,502 citations
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
2016 · 53,338 citations