This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
When Neural Network Architecture Search Meets Federated Learning Parameter Efficient Fine Tuning
2 Citations · 2 Authors · Year: 2023
Abstract
With increasing concerns regarding data privacy, federated learning has emerged as a promising approach for collaboratively training deep learning models while keeping data local. Fine-tuning pre-trained models for downstream tasks has proven effective in the field of deep learning. However, federated fine-tuning encounters various obstacles, including extensive communication overhead, high computational expense, and potential privacy leakage. To address these challenges, our study integrates existing Parameter-Efficient Fine-Tuning (PET) methods with Neural Architecture Search (NAS) in the context of federated learning (FL) to achieve high performance at a cost-effective level. Our proposed algorithm, FedNasPET, can effectively identify PET structures with low resource requirements for diverse tasks and datasets while maintaining privacy protection. Experimental results demonstrate that the structures discovered by FedNasPET achieve remarkably low communication costs (less than 0.016%) compared with the manually designed structure, while maintaining over 98.1% of the federated full fine-tuning performance. Additionally, FedNasPET enhances privacy protection by up to 81.5%. These findings highlight the notable potential of the FedNasPET structure in academic research.
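Since the abstract summarizes the approach rather than the algorithm itself, the following is only a minimal, hypothetical sketch of the underlying idea: each client trains a small parameter-efficient module (here a LoRA-style low-rank adapter) whose rank is one of the candidate "structures" a NAS could score, and the server aggregates only those few parameters, so communication scales with the adapter size rather than the full model. All names (LoRAAdapter, fed_avg), sizes, and ranks below are illustrative assumptions, not the paper's actual FedNasPET implementation.

    import numpy as np

    class LoRAAdapter:
        """Low-rank adapter: the effective weight update is B @ A, with rank r << d."""
        def __init__(self, d_in, d_out, rank, seed=0):
            rng = np.random.default_rng(seed)
            self.A = rng.normal(0.0, 0.01, (rank, d_in))  # trainable down-projection
            self.B = np.zeros((d_out, rank))              # trainable up-projection, zero-init

        def num_params(self):
            return self.A.size + self.B.size

    def fed_avg(adapters):
        """Server step: average only the adapter parameters (FedAvg-style)."""
        rank, d_in = adapters[0].A.shape
        d_out = adapters[0].B.shape[0]
        agg = LoRAAdapter(d_in, d_out, rank)
        agg.A = np.mean([a.A for a in adapters], axis=0)
        agg.B = np.mean([a.B for a in adapters], axis=0)
        return agg

    # Communication cost of candidate PET structures vs. full fine-tuning of one layer.
    d_in = d_out = 768                      # a typical transformer hidden size (assumed)
    full = d_in * d_out
    for rank in (1, 4, 8):                  # candidate structures a NAS could search over
        clients = [LoRAAdapter(d_in, d_out, rank, seed=s) for s in range(3)]
        adapter = fed_avg(clients)
        share = adapter.num_params() / full
        print(f"rank={rank}: {adapter.num_params():6d} adapter params "
              f"({share:.3%} of the full layer's parameters per round)")

Even at rank 8, the adapter amounts to roughly 2.1% of a single 768x768 layer's parameters, which illustrates the lever that allows a searched PET structure to cut federated communication so sharply relative to full fine-tuning.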
Related Works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,401 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,885 citations
Deep Learning with Differential Privacy
2016 · 5,611 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,593 citations
Large-Scale Machine Learning with Stochastic Gradient Descent
2010 · 5,570 citations