Advances and Open Problems in Federated Learning

Publication: Contribution to journal › Journal article › Research › peer-reviewed

  • Peter Kairouz
  • H. Brendan McMahan
  • Brendan Avent
  • Aurelien Bellet
  • Mehdi Bennis
  • Arjun Nitin Bhagoji
  • Kallista Bonawitz
  • Zachary Charles
  • Graham Cormode
  • Rachel Cummings
  • Rafael G. L. D'Oliveira
  • Hubert Eichner
  • Salim El Rouayheb
  • David Evans
  • Josh Gardner
  • Zachary Garrett
  • Adria Gascon
  • Badih Ghazi
  • Phillip B. Gibbons
  • Marco Gruteser
  • Zaid Harchaoui
  • Chaoyang He
  • Lie He
  • Zhouyuan Huo
  • Ben Hutchinson
  • Justin Hsu
  • Martin Jaggi
  • Tara Javidi
  • Gauri Joshi
  • Mikhail Khodak
  • Jakub Konecny
  • Aleksandra Korolova
  • Farinaz Koushanfar
  • Sanmi Koyejo
  • Tancrede Lepoint
  • Yang Liu
  • Prateek Mittal
  • Mehryar Mohri
  • Richard Nock
  • Ayfer Ozgur
  • Hang Qi
  • Daniel Ramage
  • Ramesh Raskar
  • Mariana Raykova
  • Dawn Song
  • Weikang Song
  • Sebastian U. Stich
  • Ziteng Sun
  • Ananda Theertha Suresh
  • Florian Tramer
  • Praneeth Vepakomma
  • Jianyu Wang
  • Li Xiong
  • Zheng Xu
  • Qiang Yang
  • Felix X. Yu
  • Han Yu
  • Sen Zhao

Federated learning (FL) is a machine learning setting where many clients (e.g., mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server (e.g., service provider), while keeping the training data decentralized. FL embodies the principles of focused data collection and minimization, and can mitigate many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science approaches. Motivated by the explosive growth in FL research, this monograph discusses recent advances and presents an extensive collection of open problems and challenges.
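The setting described above — clients training locally on private data while a central server only aggregates model parameters — can be sketched with a minimal federated averaging loop. This is an illustrative toy (a single scalar parameter fit by local SGD, with dataset-size-weighted averaging), not the monograph's algorithm; all function names and hyperparameters here are hypothetical choices for the sketch.

```python
import random

def local_update(w, data, lr=0.1, epochs=5):
    """One client's local SGD on its own data, which never leaves the client."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # gradient of squared error (w*x - y)^2
            w -= lr * grad
    return w

def federated_averaging(client_datasets, rounds=20):
    """Server orchestration: broadcast model, collect local updates, average."""
    w = 0.0  # global model parameter
    total = sum(len(d) for d in client_datasets)
    for _ in range(rounds):
        # Each client trains starting from the current global model.
        updates = [local_update(w, d) for d in client_datasets]
        # Aggregate: weight each client's result by its local dataset size.
        w = sum(len(d) / total * u for d, u in zip(client_datasets, updates))
    return w

# Three clients, each holding private samples of the relation y = 3x.
random.seed(0)
clients = [[(x, 3.0 * x) for x in (random.random() for _ in range(10))]
           for _ in range(3)]
w = federated_averaging(clients)
print(round(w, 2))  # converges toward the true slope 3.0
```

Only the scalar `w` crosses the network in this sketch; the raw `(x, y)` pairs stay on each client, which is the data-minimization property the abstract highlights.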

Original language: English
Journal: Foundations and Trends in Machine Learning
Volume: 14
Issue number: 1-2
Pages (from-to): 1-210
ISSN: 1935-8237
DOI:
Status: Published - 2021
Externally published: Yes
