OpenAlex · Updated hourly · Last updated: 01.05.2026, 03:22

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Optimizing Deep Neural Networks via Knowledge Distillation for Efficient and Sustainable Medical Image Classification

2026 · 0 citations · SPIRE - Sciences Po Institutional REpository · Open Access
Open full text at publisher

0 citations · 4 authors · Year: 2026

Abstract

Deep learning has revolutionized computational medicine, yet its deployment remains limited by high demands for computational resources, memory, and energy. The growing complexity of AI models, with hundreds of millions of parameters that depend on high-performance GPU clusters, hinders their implementation in resource-constrained environments where latency and data privacy are critical. This study aims to develop and validate a methodology that optimizes and compresses deep neural networks without compromising predictive performance, enabling their efficient and sustainable deployment on portable devices, edge systems, and low-power medical equipment. Knowledge Distillation (KD) transfers knowledge from a large, robust "teacher" model to a smaller, more efficient "student" model. In this study, DenseNet-121 served as the teacher and MobileNet-V2 as the student. Using the Ones strategy, which yielded the best results, the distilled MobileNet-V2 achieved an AUC-ROC of 0.865 (standard deviation 0.0010), outperforming the teacher (AUC-ROC 0.858, standard deviation 0.0015) and the undistilled student (AUC-ROC 0.824, standard deviation 0.0006). The teacher model contained 6.96 M parameters (26.87 MB), while the student used 2.23 M parameters (8.64 MB), a roughly threefold reduction in size. This compact model delivered comparable or superior accuracy with lower inference time and energy consumption, aligning with sustainability goals and enabling the deployment of robust medical AI in resource-limited environments. Code is available at: https://github.com/felixmejia/Knowledge_Distillation.
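The linked repository contains the authors' implementation; the sketch below only illustrates the generic soft-target knowledge-distillation setup the abstract describes, with a DenseNet-121 teacher and a MobileNet-V2 student, assuming PyTorch/torchvision. The temperature, alpha, learning rate, batch size, and number of classes are illustrative assumptions, not values reported in the paper, and the paper's "Ones" weighting strategy is not reproduced here.

import torch
import torch.nn.functional as F
from torchvision import models

# Teacher: DenseNet-121 (as in the paper); student: MobileNet-V2.
# num_classes is an illustrative assumption, not taken from the paper.
num_classes = 2
teacher = models.densenet121(weights=None, num_classes=num_classes)
student = models.mobilenet_v2(weights=None, num_classes=num_classes)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soft targets: teacher class probabilities at temperature T.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    # Student log-probabilities at the same temperature.
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL term, scaled by T^2 so its gradients stay comparable in
    # magnitude to the hard-label term (Hinton et al., 2015).
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# One illustrative training step on a dummy batch (values are placeholders).
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))

teacher.eval()
with torch.no_grad():  # the teacher is frozen during distillation
    teacher_logits = teacher(images)
student_logits = student(images)
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
optimizer.step()

The key design choice shown here is blending two signals: the temperature-softened teacher distribution, which conveys inter-class similarity, and the hard ground-truth labels; how those two terms are weighted is exactly where a strategy such as the paper's "Ones" scheme would plug in.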
