OpenAlex · Updated hourly · Last updated: 15.03.2026, 10:43

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

Personalized Federated Learning with Semisupervised Distillation

2021 · 5 citations · Security and Communication Networks · Open Access
Open full text at publisher

5

Citations

4

Authors

2021

Year

Abstract

Heterogeneous data and models pose critical challenges for federated learning. The traditional federated learning framework, which trains the global model by transferring model parameters, has major limitations: it requires that all participants use the same model architecture, and the trained global model does not guarantee accurate predictions on each participant’s personal data. To solve this problem, we propose a new federated framework named personalized federated learning with semisupervised distillation (pFedSD), which preserves the privacy of the participants’ model architectures and improves communication efficiency by transmitting the model’s predicted class distribution rather than model parameters. First, the server applies an adaptive aggregation method to the predicted class distributions uploaded by the clients, reducing the weight of low-quality model predictions and thereby improving the quality of the aggregated class distribution. Then, the server sends the aggregated distribution back to the clients for local training to obtain the personalized models. Finally, we conducted experiments on different datasets (MNIST, FMNIST, and CIFAR10), and the results show that pFedSD outperforms the latest federated distillation algorithms.
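The server-side step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes the server weights each client's uploaded class distributions by prediction confidence (lower average entropy means higher weight) as a stand-in for pFedSD's adaptive aggregation, whose exact weighting rule is not given here.

```python
import numpy as np

def adaptive_aggregate(client_dists, eps=1e-12):
    """Aggregate per-client predicted class distributions on the server.

    Illustrative stand-in for pFedSD's adaptive aggregation: each
    client's softmax outputs are weighted by confidence (low average
    entropy -> higher weight), down-weighting low-quality predictions.

    client_dists: shape (num_clients, num_samples, num_classes)
    returns: aggregated distribution of shape (num_samples, num_classes)
    """
    dists = np.asarray(client_dists, dtype=float)
    # Per-client average entropy over the shared samples.
    entropy = -np.sum(dists * np.log(dists + eps), axis=-1).mean(axis=1)
    # Confidence weights: lower entropy -> larger weight, normalized.
    weights = 1.0 / (entropy + eps)
    weights /= weights.sum()
    # Weighted average over clients, renormalized per sample.
    agg = np.tensordot(weights, dists, axes=1)
    return agg / agg.sum(axis=-1, keepdims=True)

# Three hypothetical clients, two shared samples, three classes.
clients = [
    [[0.80, 0.10, 0.10], [0.10, 0.80, 0.10]],  # confident client
    [[0.70, 0.20, 0.10], [0.20, 0.70, 0.10]],  # fairly confident client
    [[0.34, 0.33, 0.33], [0.33, 0.34, 0.33]],  # near-uniform, low quality
]
agg = adaptive_aggregate(clients)
```

The aggregated distribution is then broadcast back to the clients, where each client uses it as a soft (distillation) target for local training of its own architecture.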

Related works

Authors

Institutions

Topics

Privacy-Preserving Technologies in Data · Artificial Intelligence in Healthcare and Education