OpenAlex · Updated hourly · Last updated: 12.03.2026, 10:16

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Contractible Regularization for Federated Learning on Non-IID Data

2022 · 11 citations · 2022 IEEE International Conference on Data Mining (ICDM)
Open full text at publisher

Citations: 11
Authors: 7
Year: 2022

Abstract

In the medical domain, gathering all data and training a global supervised model is very difficult because data are scattered across different hospitals and subject to security and privacy concerns. In recent years, several federated learning models have been proposed for training over isolated data. These models usually employ a client-server framework: 1) train local models on clients in parallel; 2) aggregate local models on the server to produce a global one. By iterating the above two steps, federated learning aims to approximate the performance of a model centrally trained on all the data. However, due to the non-IID data distribution issue, local models can deviate from the optimal model, resulting in a biased aggregated global model. To address this problem, we propose a contractible regularization (ConTre) that acts on the local model’s latent space. On each client, we first project the input data into a latent space and then impose a regularization to avoid converging too fast to bad local optima. The proposed regularization can be easily integrated into existing federated learning frameworks without introducing additional parameters. According to experimental results on multiple natural and medical image datasets, the proposed ConTre can significantly improve the performance of various federated learning frameworks. Our code is available at https://github.com/czifan/ConTre.pytorch.
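The two-step client-server loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the server step is the standard size-weighted FedAvg average, and `contractive_penalty` is a hypothetical stand-in for ConTre that penalizes how sensitive a toy latent projection is to input perturbations; the paper's actual regularizer may take a different form.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Server step: size-weighted average of client model weights (FedAvg)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

def contractive_penalty(encoder_w, x, eps=1e-4):
    """Hypothetical stand-in for ConTre: penalize the sensitivity of the
    latent projection z = tanh(x @ W) to small input perturbations,
    estimated by finite differences. Added to the local loss on each
    client, it discourages the local latent space from drifting too fast."""
    z = np.tanh(x @ encoder_w)
    z_pert = np.tanh((x + eps) @ encoder_w)
    return float(np.mean((z_pert - z) ** 2) / eps ** 2)
```

In each round, every client would minimize its task loss plus a weighted `contractive_penalty` term, then the server would call `fedavg` on the returned weights; because the penalty only shapes the loss, it adds no parameters to the model, consistent with the abstract's claim.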

Related works

Authors

Institutions

Topics

Privacy-Preserving Technologies in Data
COVID-19 diagnosis using AI
Artificial Intelligence in Healthcare and Education