This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Split learning for health: Distributed deep learning without sharing raw patient data
347 citations · 4 authors · 2018
Abstract
Can health entities collaboratively train deep learning models without sharing sensitive raw data? This paper proposes several configurations of a distributed deep learning method called SplitNN to facilitate such collaborations. SplitNN shares neither raw data nor model details with collaborating institutions. The proposed configurations of SplitNN cater to practical settings of i) entities holding different modalities of patient data, ii) centralized and local health entities collaborating on multiple tasks, and iii) learning without sharing labels. We compare performance and resource-efficiency trade-offs of SplitNN against other distributed deep learning methods such as federated learning and large batch synchronous stochastic gradient descent, and show highly encouraging results for SplitNN.
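To make the abstract's core idea concrete, here is a minimal numpy sketch of the basic split-learning flow: the client runs the network up to a "cut layer" and transmits only the resulting activations, while the server finishes the forward pass and returns the gradient at the cut. The tiny tanh network, the variable names (`W_client`, `W_server`, `smashed`), and the synthetic data are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data held by the health entity (client): 8 samples, 4 features.
# In the basic setup, raw X never leaves the client.
X = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))

# Client-side weights (up to the cut layer) and server-side weights.
W_client = 0.1 * rng.normal(size=(4, 3))
W_server = 0.1 * rng.normal(size=(3, 1))
lr = 0.1

def loss_fn():
    # 0.5 * MSE over the full toy dataset.
    return 0.5 * np.mean((np.tanh(X @ W_client) @ W_server - y) ** 2)

loss_before = loss_fn()

for _ in range(200):
    # Client forward pass: only these "smashed" activations cross
    # the network boundary, not the raw patient data.
    smashed = np.tanh(X @ W_client)

    # Server forward pass and loss gradient.
    pred = smashed @ W_server
    err = pred - y  # dL/dpred for 0.5*MSE (up to the 1/N factor)

    # Server backward pass: compute the gradient to send back to the
    # client, then update the server-side weights.
    grad_smashed = err @ W_server.T
    W_server -= lr * smashed.T @ err / len(X)

    # Client backward pass using only the returned gradient.
    grad_pre = grad_smashed * (1 - smashed ** 2)  # tanh derivative
    W_client -= lr * X.T @ grad_pre / len(X)

loss_after = loss_fn()
```

After a few hundred steps the loss on the toy data drops, showing that the two halves train jointly even though neither side ever sees the other's weights or the raw inputs.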
Similar works
"Why Should I Trust You?"
2016 · 14,615 citations
Coding Algorithms for Defining Comorbidities in ICD-9-CM and ICD-10 Administrative Data
2005 · 10,529 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,883 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,451 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,948 citations