This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
Communication-efficient federated learning via knowledge distillation
544 citations · 5 authors · 2022
Abstract
Federated learning is a privacy-preserving machine learning technique for training intelligent models from decentralized data; it exploits private data by communicating local model updates in each iteration of model learning rather than the raw data. However, model updates can be extremely large if they contain numerous parameters, and many rounds of communication are needed for model training. The huge communication cost of federated learning places heavy overheads on clients and a high environmental burden. Here, we present FedKD, a federated learning method that is both communication-efficient and effective, based on adaptive mutual knowledge distillation and dynamic gradient compression techniques. FedKD is validated in three different scenarios that need privacy protection, showing that it can reduce communication cost by up to 94.89% while achieving results competitive with centralized model learning. FedKD offers the potential to efficiently deploy privacy-preserving intelligent systems in many scenarios, such as intelligent healthcare and personalization.
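The two techniques named in the abstract can be illustrated briefly. The sketch below is a minimal, assumption-laden illustration rather than the paper's implementation: it assumes a standard temperature-softened KL formulation of mutual distillation between a local teacher and a compact shared student, and an SVD-based low-rank approximation for gradient compression. The function names (mutual_distillation_loss, compress_gradient) and the energy_threshold parameter are hypothetical, not taken from the paper.

    import torch
    import torch.nn.functional as F

    def mutual_distillation_loss(student_logits, teacher_logits, labels, T=2.0):
        # Hypothetical sketch: each model fits the ground-truth labels and
        # also mimics the other's temperature-softened output distribution.
        ce_student = F.cross_entropy(student_logits, labels)
        ce_teacher = F.cross_entropy(teacher_logits, labels)
        kl_student = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits.detach() / T, dim=-1),
            reduction="batchmean") * T * T
        kl_teacher = F.kl_div(
            F.log_softmax(teacher_logits / T, dim=-1),
            F.softmax(student_logits.detach() / T, dim=-1),
            reduction="batchmean") * T * T
        return ce_student + kl_student, ce_teacher + kl_teacher

    def compress_gradient(grad_matrix, energy_threshold=0.95):
        # Hypothetical low-rank compression: keep only as many singular
        # values as needed to retain the given fraction of spectral energy,
        # so a client transmits the factors (U_k, S_k, V_k) instead of the
        # full gradient matrix.
        U, S, Vh = torch.linalg.svd(grad_matrix, full_matrices=False)
        energy = torch.cumsum(S**2, dim=0) / torch.sum(S**2)
        k = int((energy < energy_threshold).sum().item()) + 1
        return U[:, :k], S[:k], Vh[:k, :]

Under these assumptions, only the compact student's (compressed) updates would travel between clients and server in each round, which is where the communication savings would come from.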
Similar works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,451 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,968 citations
Deep Learning with Differential Privacy
2016 · 5,759 citations
Federated Machine Learning
2019 · 5,734 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,613 citations