This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Communication-Efficient Learning of Deep Networks from Decentralized Data
Citations: 5,590
Authors: 5
Year: 2016
Abstract
Modern mobile devices have access to a wealth of data suitable for learning models, which in turn can greatly improve the user experience on the device. For example, language models can improve speech recognition and text entry, and image models can automatically select good photos. However, this rich data is often privacy sensitive, large in quantity, or both, which may preclude logging to the data center and training there using conventional approaches. We advocate an alternative that leaves the training data distributed on the mobile devices, and learns a shared model by aggregating locally-computed updates. We term this decentralized approach Federated Learning.

We present a practical method for the federated learning of deep networks based on iterative model averaging, and conduct an extensive empirical evaluation, considering five different model architectures and four datasets. These experiments demonstrate the approach is robust to the unbalanced and non-IID data distributions that are a defining characteristic of this setting. Communication costs are the principal constraint, and we show a reduction in required communication rounds by 10-100x as compared to synchronized stochastic gradient descent.
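The iterative model averaging described in the abstract is widely known as Federated Averaging. The following is a minimal sketch of the idea in plain Python/NumPy, not the authors' implementation: a toy linear model stands in for the deep networks used in the paper, all function names are illustrative, and for simplicity every client participates in every round (the paper samples a fraction of clients).

```python
import numpy as np

def client_update(weights, data, labels, lr=0.1, epochs=1):
    """Local training on one client: a few epochs of gradient descent
    on data that never leaves the device (toy linear model, squared loss)."""
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_averaging(clients, rounds=10, dim=5):
    """Learn one shared model by averaging locally computed weights.

    `clients` is a list of (data, labels) pairs held on-device;
    only model weights are communicated in each round."""
    global_w = np.zeros(dim)
    for _ in range(rounds):
        local_weights, local_sizes = [], []
        for data, labels in clients:  # the paper samples a client subset here
            local_weights.append(client_update(global_w, data, labels))
            local_sizes.append(len(labels))
        # Weighted average by local example count, so clients with
        # unbalanced amounts of data contribute proportionally.
        total = sum(local_sizes)
        global_w = sum(n / total * w
                       for w, n in zip(local_weights, local_sizes))
    return global_w

# Hypothetical usage: four clients with synthetic linear data.
rng = np.random.default_rng(0)
true_w = rng.normal(size=5)
clients = [(X, X @ true_w)
           for X in (rng.normal(size=(20, 5)) for _ in range(4))]
print(federated_averaging(clients))
```

Weighting the average by each client's example count is what makes the scheme tolerate unbalanced data, and running multiple local epochs before each aggregation is the source of the communication savings over synchronized SGD, which would communicate after every gradient step.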
Related Works
k-Anonymity: A Model for Protecting Privacy
2002 · 8,389 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,864 citations
Deep Learning with Differential Privacy
2016 · 5,571 citations
Large-Scale Machine Learning with Stochastic Gradient Descent
2010 · 5,558 citations
Federated Machine Learning
2019 · 5,524 citations