A Smooth Binary Mechanism for Efficient Private Continual Observation
Research output: Contribution to conference › Paper › Research
Standard
A Smooth Binary Mechanism for Efficient Private Continual Observation. / Andersson, Joel Daniel; Pagh, Rasmus.
2023. Paper presented at 37th Conference on Neural Information Processing Systems - NeurIPS 2023, New Orleans, United States.
RIS
TY - CONF
T1 - A Smooth Binary Mechanism for Efficient Private Continual Observation
AU - Andersson, Joel Daniel
AU - Pagh, Rasmus
PY - 2023
Y1 - 2023
N2 - In privacy under continual observation we study how to release differentially private estimates based on a dataset that evolves over time. The problem of releasing private prefix sums of x_1, x_2, x_3, … ∈ {0, 1} (where the value of each x_i is to be private) is particularly well-studied, and a generalized form is used in state-of-the-art methods for private stochastic gradient descent (SGD). The seminal binary mechanism privately releases the first t prefix sums with noise of variance polylogarithmic in t. Recently, Henzinger et al. and Denisov et al. showed that it is possible to improve on the binary mechanism in two ways: the variance of the noise can be reduced by a (large) constant factor, and also made more even across time steps. However, their algorithms for generating the noise distribution are not as efficient as one would like in terms of computation time and (in particular) space. We address the efficiency problem by presenting a simple alternative to the binary mechanism in which 1) generating the noise takes constant average time per value, 2) the variance is reduced by a factor of about 4 compared to the binary mechanism, and 3) the noise distribution at each step is identical. Empirically, a simple Python implementation of our approach outperforms the running time of the approach of Henzinger et al., as well as an attempt to improve their algorithm using high-performance algorithms for multiplication with Toeplitz matrices.
M3 - Paper
T2 - 37th Conference on Neural Information Processing Systems - NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -
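The abstract describes privately releasing prefix sums of a 0/1 stream. For context, a minimal Python sketch of the classic binary mechanism that the paper improves upon (not the paper's smooth variant) could look like the following; the function name and the choice of ε-DP Laplace noise are illustrative assumptions:

```python
import math
import random

def binary_mechanism(stream, epsilon):
    """Illustrative sketch of the classic binary mechanism for
    epsilon-DP prefix sums of a 0/1 stream: maintain one partial
    sum ("p-sum") per level of a binary tree over time; each output
    is a sum of at most log2(T)+1 noisy p-sums."""
    T = len(stream)
    levels = int(math.log2(T)) + 1 if T > 0 else 1
    scale = levels / epsilon            # each x_t touches <= `levels` p-sums
    psum = [0.0] * (levels + 1)         # exact p-sum per level
    noisy = [0.0] * (levels + 1)        # noisy p-sum per level
    out = []
    for t, x in enumerate(stream, start=1):
        i = (t & -t).bit_length() - 1   # index of lowest set bit of t
        psum[i] = sum(psum[:i]) + x     # merge lower levels into level i
        for j in range(i):              # ... and reset them
            psum[j] = noisy[j] = 0.0
        # Laplace(scale) noise, sampled as a difference of two exponentials
        noisy[i] = psum[i] + random.expovariate(1/scale) - random.expovariate(1/scale)
        # the released prefix sum follows the binary expansion of t
        out.append(sum(noisy[j] for j in range(levels + 1) if (t >> j) & 1))
    return out
```

Each released sum aggregates a varying number of noisy p-sums, which is exactly the unevenness across time steps that the paper's smooth mechanism removes, while also cutting the variance by a factor of about 4.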