Infinitely Divisible Noise in the Low Privacy Regime
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Documents
- Fulltext
Final published version, 467 KB, PDF document
Federated learning, in which training data is distributed among users and never shared, has emerged as a popular approach to privacy-preserving machine learning. Cryptographic techniques such as secure aggregation are used to aggregate contributions, like a model update, from all users. A robust technique for making such aggregates differentially private is to exploit *infinite divisibility* of the Laplace distribution, namely, that a Laplace distribution can be expressed as a sum of i.i.d. noise shares from a Gamma distribution, one share added by each user. However, Laplace noise is known to have suboptimal error in the low privacy regime for ε-differential privacy, where ε > 1 is a large constant. In this paper we present the first infinitely divisible noise distribution for real-valued data that achieves ε-differential privacy and has expected error that decreases exponentially with ε.
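The distributed Laplace baseline the abstract refers to can be sketched as follows. This is a minimal NumPy illustration of the standard Gamma-share construction (not the paper's new noise distribution): a centered Laplace variable with scale b is the difference of two Gamma(1, b) variables, and Gamma is infinitely divisible, so each of n users can add the difference of two independent Gamma(1/n, b) draws; the shares sum to Laplace(0, b) noise at the aggregator. The user count, scale, and trial count below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = 20          # number of users contributing a noise share
b = 1.0               # target Laplace scale (sensitivity / epsilon)
trials = 50_000       # Monte Carlo repetitions to check the aggregate

# Each user's share: difference of two independent Gamma(1/n, b) draws.
shares = (rng.gamma(1.0 / n_users, b, size=(trials, n_users))
          - rng.gamma(1.0 / n_users, b, size=(trials, n_users)))

# Aggregate noise as seen by the server after secure aggregation.
total = shares.sum(axis=1)

# The aggregate should behave like Laplace(0, b):
# mean 0 and variance 2 * b**2.
print(total.mean(), total.var())
```

No single share reveals the total noise, yet their sum is exactly Laplace-distributed, which is what makes the construction compatible with secure aggregation.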
Original language | English |
---|---|
Title of host publication | Proceedings of The 33rd International Conference on Algorithmic Learning Theory |
Publisher | PMLR |
Publication date | 2022 |
Pages | 881-909 |
Publication status | Published - 2022 |
Event | 33rd International Conference on Algorithmic Learning Theory (ALT 2022), Paris, France. Duration: 29 Mar 2022 → 1 Apr 2022 |
Conference
Conference | 33rd International Conference on Algorithmic Learning Theory (ALT 2022) |
---|---|
Country | France |
City | Paris |
Period | 29/03/2022 → 01/04/2022 |
Series | Proceedings of Machine Learning Research |
---|---|
Volume | 167 |
ISSN | 2640-3498 |
Links
- https://proceedings.mlr.press/v167/pagh22a.html