OpenAlex · Updated hourly · Last updated: 25.03.2026, 10:04

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

When Neural Network Architecture Search Meets Federated Learning Parameter Efficient Fine Tuning

2023 · 2 citations
Open full text at publisher

Citations: 2 · Authors: 2 · Year: 2023

Abstract

With increasing concerns regarding data privacy, federated learning has emerged as a promising approach for collaboratively training deep learning models while keeping data local. Fine-tuning pre-trained models for downstream tasks has proven effective in deep learning. However, federated fine-tuning faces several obstacles, including extensive communication overhead, high computational cost, and potential privacy leakage. To address these challenges, our study integrates existing Parameter-Efficient Tuning (PET) methods with Neural Architecture Search (NAS) in the context of federated learning (FL) to achieve high performance at a cost-effective level. Our proposed algorithm, FedNasPET, effectively identifies PET structures with low resource requirements for diverse tasks and datasets while maintaining privacy protection. Experimental results demonstrate that the structure discovered by FedNasPET achieves remarkably low communication costs (less than 0.016%) compared to the manually designed structure, while retaining over 98.1% of the performance of federated full fine-tuning. Additionally, FedNasPET enhances privacy protection by up to 81.5%. These findings highlight the notable potential of the FedNasPET structure in academic research.
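The abstract describes searching for PET structures that minimize communication cost while preserving performance. The following is a minimal, illustrative sketch of that cost-aware search idea; the parameter counts, the LoRA-style adapter assumption, and all function names are our own assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the FedNasPET search idea described in the abstract:
# choose a parameter-efficient tuning (PET) structure that minimizes the
# fraction of parameters clients must communicate, subject to a performance
# proxy. All sizes and the scoring function below are illustrative.

FULL_MODEL_PARAMS = 110_000_000  # BERT-base-sized backbone (assumption)

def pet_params(rank: int, layers: int = 12, hidden: int = 768) -> int:
    """Trainable parameters of a LoRA-style adapter: two low-rank
    matrices per adapted layer (one illustrative PET structure)."""
    return layers * 2 * rank * hidden

def comm_fraction(rank: int) -> float:
    """Fraction of the full model uploaded per round when only the
    PET parameters are communicated, instead of all weights."""
    return pet_params(rank) / FULL_MODEL_PARAMS

def search(candidate_ranks, score_fn, min_score: float):
    """Pick the cheapest candidate whose (proxy) score clears the
    threshold, mimicking a communication-cost-aware structure search."""
    feasible = [(comm_fraction(r), r) for r in candidate_ranks
                if score_fn(r) >= min_score]
    return min(feasible)[1] if feasible else None

# Toy performance proxy: larger rank scores higher, with diminishing returns.
score = lambda r: 1 - 0.5 ** r
best = search([1, 2, 4, 8, 16], score, min_score=0.9)
print(best, f"{comm_fraction(best):.4%}")
```

In a federated deployment, only the selected PET parameters would be exchanged between clients and the server each round, which is what makes the communication fraction (rather than raw accuracy alone) the quantity worth searching over.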


Topics

Privacy-Preserving Technologies in Data · Artificial Intelligence in Healthcare and Education · Stochastic Gradient Optimization Techniques