This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Privacy-first health research with federated learning
24 citations · 16 authors · 2020
Abstract
Privacy protection is paramount in conducting health research. However, studies often rely on data stored in a centralized repository, where analysis is done with full access to the sensitive underlying content. Recent advances in federated learning enable building complex machine-learned models that are trained in a distributed fashion. These techniques facilitate the calculation of research study endpoints such that private data never leaves a given device or healthcare system. We show on a diverse set of health studies that federated models can achieve the same level of accuracy, precision, and generalizability, and result in the same interpretation as standard centralized statistical models whilst achieving significantly stronger privacy protections. This work is the first to apply modern and general federated learning methods to clinical and epidemiological research -- across a spectrum of units of federation and model architectures. As a result, it enables health research participants to remain in control of their data and still contribute to advancing science -- aspects that used to be at odds with each other.
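The abstract's core mechanism, that only model updates leave a site while raw patient data stays put, can be illustrated with a minimal federated averaging (FedAvg) sketch. This is a generic illustration, not the paper's actual pipeline; the two sites, their data, the learning rate, and the round counts are all invented for the example.

```python
# Minimal sketch of federated averaging (FedAvg). Illustrative only:
# site names, data, and hyperparameters are invented, not from the paper.

def local_update(weights, data, lr=0.02, epochs=5):
    """One site's training step on its PRIVATE data (1-D linear
    regression via gradient descent). Only the updated weights leave
    the site, never the data itself."""
    w, b = weights
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in data:
            err = (w * x + b) - y
            grad_w += 2 * err * x / len(data)
            grad_b += 2 * err / len(data)
        w -= lr * grad_w
        b -= lr * grad_b
    return (w, b)

def fed_avg(updates, sizes):
    """Server aggregates site weights, weighted by local dataset size."""
    total = sum(sizes)
    w = sum(u[0] * n for u, n in zip(updates, sizes)) / total
    b = sum(u[1] * n for u, n in zip(updates, sizes)) / total
    return (w, b)

# Two hypothetical sites, each holding local samples of y = 2x + 1.
site_a = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
site_b = [(3.0, 7.0), (4.0, 9.0)]

weights = (0.0, 0.0)
for _ in range(200):  # communication rounds
    updates = [local_update(weights, site) for site in (site_a, site_b)]
    weights = fed_avg(updates, [len(site_a), len(site_b)])

print(round(weights[0], 2), round(weights[1], 2))  # → 2.0 1.0
```

The same averaging scheme scales from individual devices up to whole healthcare systems as the unit of federation, which is the spectrum the abstract refers to.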
Related works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,445 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,954 citations
Deep Learning with Differential Privacy
2016 · 5,722 citations
Federated Machine Learning
2019 · 5,697 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,606 citations