This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Distilling Knowledge From Graph Convolutional Networks
Citations: 229
Authors: 5
Year: 2020
Abstract
Existing knowledge distillation methods focus on convolutional neural networks (CNNs), where input samples such as images lie on a grid, and have largely overlooked graph convolutional networks (GCNs), which handle non-grid data. In this paper, we propose, to the best of our knowledge, the first dedicated approach to distilling knowledge from a pre-trained GCN model. To enable knowledge transfer from the teacher GCN to the student, we propose a local structure preserving module that explicitly accounts for the topological semantics of the teacher. In this module, the local structure information of both the teacher and the student is extracted as distributions, and minimizing the distance between these distributions enables topology-aware knowledge transfer from the teacher, yielding a compact yet high-performance student model. Moreover, the proposed approach is readily extendable to dynamic graph models, where the input graphs for the teacher and the student may differ. We evaluate the proposed method on two different datasets using GCN models of different architectures, and demonstrate that our method achieves state-of-the-art knowledge distillation performance for GCN models.
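The abstract's core mechanism, representing each node's local structure as a distribution over its neighbors and matching the teacher's and student's distributions, can be illustrated with a short sketch. The snippet below is a minimal PyTorch illustration based only on the abstract: the RBF kernel, the bandwidth `sigma`, the function names `local_structure` and `lsp_loss`, and the KL-divergence matching objective are assumptions chosen for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def local_structure(h, adj, sigma=1.0):
    """Per-node distribution over neighbors from an RBF kernel on node
    embeddings. h: [N, d] embeddings, adj: [N, N] 0/1 adjacency without
    self-loops. Assumes every node has at least one neighbor (an isolated
    node would yield a NaN row). The kernel choice is an assumption."""
    logits = -torch.cdist(h, h) ** 2 / (2 * sigma ** 2)   # RBF log-kernel, [N, N]
    logits = logits.masked_fill(adj == 0, float('-inf'))  # restrict to real edges
    return F.softmax(logits, dim=1)                       # each row sums to 1

def lsp_loss(h_student, h_teacher, adj, sigma=1.0, eps=1e-8):
    """KL divergence between teacher and student local-structure
    distributions, averaged over nodes; a sketch of the topology-aware
    transfer the abstract describes, not the paper's exact loss."""
    p_t = local_structure(h_teacher, adj, sigma)
    p_s = local_structure(h_student, adj, sigma)
    kl = p_t * (torch.log(p_t + eps) - torch.log(p_s + eps))  # zero off-edges
    return kl.sum(dim=1).mean()

# Toy usage: a 4-node ring graph. Teacher and student embedding widths differ,
# which is fine because each distribution is computed within a single model.
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=torch.float)
h_teacher = torch.randn(4, 16)  # hypothetical teacher GCN node embeddings
h_student = torch.randn(4, 8)   # hypothetical (smaller) student embeddings
print(lsp_loss(h_student, h_teacher, adj).item())
```

Because the distributions are defined over the shared edge set rather than over embedding coordinates, the student's hidden width can differ from the teacher's, which is what allows a compact student as the abstract claims.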
Related Works
Deep Residual Learning for Image Recognition
2016 · 216,020 citations
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 85,918 citations
ImageNet classification with deep convolutional neural networks
2017 · 75,547 citations
Very Deep Convolutional Networks for Large-Scale Image Recognition
2014 · 75,404 citations
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
2016 · 52,636 citations