This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Privacy Enhancing and Scalable Federated Learning to Accelerate AI Implementation in Cross-Silo and IoMT Environments
Citations: 28
Authors: 9
Year: 2022
Abstract
Federated Learning (FL) is a machine learning technique that enables collaborative learning of valuable information across devices or sites without moving the data. In FL, the model is trained and shared across decentralized locations where the data are privately owned. After local training, model updates are sent back to a central server, thus enabling access to distributed data at a large scale while maintaining privacy, security, and data access rights. Although FL is a well-studied topic, existing frameworks are still at an early stage of development. They encounter challenges with respect to scalability, data security, aggregation methodologies, data provenance, and production readiness. In this paper, we propose a novel FL framework that supports functionalities such as scalable processing with respect to data, devices, sites, and collaborators, monitoring services, privacy, and support for use cases. Furthermore, we integrate multi-party computation (MPC) within the FL setup, preventing reverse-engineering attacks. The proposed framework has been evaluated in diverse use cases in both cross-device and cross-silo settings. In the former case, in-device FL is leveraged in the context of an AI-driven Internet of Medical Things (IoMT) environment. We demonstrate the framework's suitability for a range of AI techniques while benchmarking against conventional centralized training. Furthermore, we show the feasibility of developing a user-friendly pipeline that enables an efficient implementation of FL in diverse clinical use cases.
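The abstract describes the standard FL loop (local training, then aggregation of model updates at a central server) combined with MPC so that no individual client update can be reverse-engineered. The sketch below is not the paper's framework; it is a minimal illustration of that combination using federated averaging plus pairwise additive masking (a common MPC-style secure-aggregation scheme), with logistic-regression SGD standing in for an arbitrary local model. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training step (illustrative: logistic-regression SGD)."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)          # mean log-loss gradient
        w -= lr * grad
    return w

def masked_updates(client_weights, rng):
    """Pairwise additive masking: each pair (i, j) shares a random mask that
    client i adds and client j subtracts. The masks cancel when the server
    sums, so the server only ever sees the aggregate, never a single update."""
    n = len(client_weights)
    masked = [w.copy() for w in client_weights]
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.normal(size=client_weights[0].shape)
            masked[i] += mask
            masked[j] -= mask
    return masked

def federated_round(global_w, clients, rng):
    """One FL round: local training on each client, then secure averaging."""
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    masked = masked_updates(local_ws, rng)
    return sum(masked) / len(masked)  # numerically equal to plain FedAvg
```

Because the masks cancel in the sum, the securely aggregated model is (up to floating-point order) identical to plain federated averaging, which is why this defence adds privacy without changing the learned model.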
Related Works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,396 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,872 citations
Deep Learning with Differential Privacy
2016 · 5,595 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,591 citations
Large-Scale Machine Learning with Stochastic Gradient Descent
2010 · 5,564 citations