This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Pathryoshka: Compressing Pathology Foundation Models via Multi-Teacher Knowledge Distillation with Nested Embeddings
Citations: 0 · Authors: 7 · Year: 2025
Abstract
Pathology foundation models (FMs) have driven significant progress in computational pathology. However, these high-performing models can easily exceed a billion parameters and produce high-dimensional embeddings, limiting their applicability in research and clinical settings where computing resources are constrained. Here, we introduce Pathryoshka, a multi-teacher distillation framework inspired by RADIO distillation and Matryoshka Representation Learning that reduces pathology FM sizes while allowing for adaptable embedding dimensions. We evaluate our framework with a distilled model on ten public pathology benchmarks covering varying downstream tasks. Compared to its much larger teachers, Pathryoshka reduces model size by 86-92% while maintaining on-par performance. It outperforms state-of-the-art single-teacher distillation models of comparable size by a median margin of 7.0 in accuracy. By enabling efficient local deployment without sacrificing accuracy or representational richness, Pathryoshka democratizes access to state-of-the-art pathology FMs for the broader research and clinical community.
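As a rough illustration of the abstract's core idea, the sketch below combines multi-teacher embedding distillation with Matryoshka-style nested losses in PyTorch. All names, dimensions, and the cosine objective are assumptions chosen for illustration, not the paper's actual implementation, which is described in the full article.

import torch
import torch.nn as nn
import torch.nn.functional as F

NESTED_DIMS = (64, 128, 256, 512)   # hypothetical nested embedding sizes
TEACHER_DIMS = (1024, 1536)         # hypothetical teacher embedding sizes

# One projection head per (prefix dim, teacher) pair, mapping each nested
# prefix of the student embedding into that teacher's embedding space.
heads = nn.ModuleDict({
    f"{d}_{t}": nn.Linear(d, t_dim)
    for d in NESTED_DIMS
    for t, t_dim in enumerate(TEACHER_DIMS)
})

def distill_loss(student_emb, teacher_embs):
    """Cosine distillation loss averaged over teachers and nested prefixes.

    student_emb:  (batch, 512) full student embedding
    teacher_embs: list of (batch, t_dim) embeddings from frozen teachers
    """
    loss = 0.0
    for d in NESTED_DIMS:
        prefix = student_emb[:, :d]              # Matryoshka sub-embedding
        for t, teacher in enumerate(teacher_embs):
            proj = heads[f"{d}_{t}"](prefix)     # map into teacher space
            loss = loss + (1 - F.cosine_similarity(proj, teacher, dim=-1)).mean()
    return loss / (len(NESTED_DIMS) * len(teacher_embs))

# Usage with random tensors standing in for real model outputs.
student = torch.randn(8, 512)
teachers = [torch.randn(8, d) for d in TEACHER_DIMS]
print(distill_loss(student, teachers))

Applying the loss to every prefix is what makes the final embedding truncatable: any leading slice of the full student vector remains a usable representation at reduced fidelity.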
Similar Works
A survey on deep learning in medical image analysis
2017 · 13,704 citations
Dermatologist-level classification of skin cancer with deep neural networks
2017 · 13,296 citations
A survey on Image Data Augmentation for Deep Learning
2019 · 11,890 citations
QuPath: Open source software for digital pathology image analysis
2017 · 8,267 citations
Radiomics: Images Are More than Pictures, They Are Data
2015 · 8,061 citations