Improved Utility Analysis of Private CountSketch

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Improved Utility Analysis of Private CountSketch. / Pagh, Rasmus; Thorup, Mikkel.

Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, 2022. (Advances in Neural Information Processing Systems, Vol. 35).

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Harvard

Pagh, R & Thorup, M 2022, Improved Utility Analysis of Private CountSketch. in Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, Advances in Neural Information Processing Systems, vol. 35, 36th Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans/Virtual, United States, 28/11/2022.

APA

Pagh, R., & Thorup, M. (2022). Improved Utility Analysis of Private CountSketch. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings. (Advances in Neural Information Processing Systems, Vol. 35).

Vancouver

Pagh R, Thorup M. Improved Utility Analysis of Private CountSketch. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings. 2022. (Advances in Neural Information Processing Systems, Vol. 35).

Author

Pagh, Rasmus ; Thorup, Mikkel. / Improved Utility Analysis of Private CountSketch. Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, 2022. (Advances in Neural Information Processing Systems, Vol. 35).

Bibtex

@inproceedings{48d9b2275d99494d9b9f40ce8472dffb,
title = "Improved Utility Analysis of Private CountSketch",
abstract = "Sketching is an important tool for dealing with high-dimensional vectors that are sparse (or well-approximated by a sparse vector), especially useful in distributed, parallel, and streaming settings. It is known that sketches can be made differentially private by adding noise according to the sensitivity of the sketch, and this has been used in private analytics and federated learning settings. The post-processing property of differential privacy implies that \emph{all} estimates computed from the sketch can be released within the given privacy budget. In this paper we consider the classical CountSketch, made differentially private with the Gaussian mechanism, and give an improved analysis of its estimation error. Perhaps surprisingly, the privacy-utility trade-off is essentially the best one could hope for, independent of the number of repetitions in CountSketch: The error is almost identical to the error from non-private CountSketch plus the noise needed to make the vector private in the original, high-dimensional domain.",
author = "Rasmus Pagh and Mikkel Thorup",
year = "2022",
language = "English",
series = "Advances in Neural Information Processing Systems",
publisher = "NeurIPS Proceedings",
booktitle = "Advances in Neural Information Processing Systems 35 (NeurIPS 2022)",
note = "36th Conference on Neural Information Processing Systems (NeurIPS 2022) ; Conference date: 28-11-2022 Through 09-12-2022",

}

RIS

TY - GEN

T1 - Improved Utility Analysis of Private CountSketch

AU - Pagh, Rasmus

AU - Thorup, Mikkel

PY - 2022

Y1 - 2022

N2 - Sketching is an important tool for dealing with high-dimensional vectors that are sparse (or well-approximated by a sparse vector), especially useful in distributed, parallel, and streaming settings. It is known that sketches can be made differentially private by adding noise according to the sensitivity of the sketch, and this has been used in private analytics and federated learning settings. The post-processing property of differential privacy implies that \emph{all} estimates computed from the sketch can be released within the given privacy budget. In this paper we consider the classical CountSketch, made differentially private with the Gaussian mechanism, and give an improved analysis of its estimation error. Perhaps surprisingly, the privacy-utility trade-off is essentially the best one could hope for, independent of the number of repetitions in CountSketch: The error is almost identical to the error from non-private CountSketch plus the noise needed to make the vector private in the original, high-dimensional domain.

AB - Sketching is an important tool for dealing with high-dimensional vectors that are sparse (or well-approximated by a sparse vector), especially useful in distributed, parallel, and streaming settings. It is known that sketches can be made differentially private by adding noise according to the sensitivity of the sketch, and this has been used in private analytics and federated learning settings. The post-processing property of differential privacy implies that \emph{all} estimates computed from the sketch can be released within the given privacy budget. In this paper we consider the classical CountSketch, made differentially private with the Gaussian mechanism, and give an improved analysis of its estimation error. Perhaps surprisingly, the privacy-utility trade-off is essentially the best one could hope for, independent of the number of repetitions in CountSketch: The error is almost identical to the error from non-private CountSketch plus the noise needed to make the vector private in the original, high-dimensional domain.

M3 - Article in proceedings

T3 - Advances in Neural Information Processing Systems

BT - Advances in Neural Information Processing Systems 35 (NeurIPS 2022)

PB - NeurIPS Proceedings

Y2 - 28 November 2022 through 9 December 2022

ER -
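
Illustrative code sketch

The abstract above describes making a CountSketch differentially private by adding Gaussian noise calibrated to the sketch's sensitivity, and then answering point queries from the noisy sketch by post-processing. The NumPy sketch below illustrates that pipeline only; it is not code from the paper. The hash construction (fully random bucket and sign maps), the function and parameter names, and the noise calibration sigma = sqrt(reps) * Delta_2 * sqrt(2 ln(1.25/delta)) / eps are assumptions chosen for exposition.

import numpy as np

def make_hashes(d, width, reps, seed=0):
    # Random bucket map h_j(i) in {0,...,width-1} and sign map s_j(i) in {-1,+1}
    # for each of the reps repetitions (fully random maps, for simplicity).
    rng = np.random.default_rng(seed)
    buckets = rng.integers(0, width, size=(reps, d))
    signs = rng.choice([-1.0, 1.0], size=(reps, d))
    return buckets, signs

def countsketch(x, buckets, signs, width):
    # C[j, h_j(i)] += s_j(i) * x[i] for every coordinate i and repetition j.
    reps, d = buckets.shape
    C = np.zeros((reps, width))
    for j in range(reps):
        np.add.at(C[j], buckets[j], signs[j] * x)
    return C

def privatize(C, eps, delta, l2_sensitivity_x=1.0):
    # Gaussian mechanism on the sketch (illustrative calibration). Each coordinate
    # of x lands in exactly one bucket per repetition with a +-1 sign, so the L2
    # sensitivity of the sketch is sqrt(reps) times the L2 sensitivity of x.
    reps = C.shape[0]
    sketch_sensitivity = np.sqrt(reps) * l2_sensitivity_x
    sigma = sketch_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    rng = np.random.default_rng(1)
    return C + rng.normal(0.0, sigma, size=C.shape)

def estimate(C, buckets, signs, i):
    # Standard CountSketch point estimate: median over repetitions of s_j(i) * C[j, h_j(i)].
    reps = buckets.shape[0]
    vals = [signs[j, i] * C[j, buckets[j, i]] for j in range(reps)]
    return float(np.median(vals))

if __name__ == "__main__":
    d, width, reps = 10_000, 256, 5
    x = np.zeros(d)
    x[:10] = 100.0  # a sparse vector with a few heavy coordinates
    buckets, signs = make_hashes(d, width, reps)
    C = countsketch(x, buckets, signs, width)
    C_priv = privatize(C, eps=1.0, delta=1e-6, l2_sensitivity_x=1.0)
    print("non-private estimate of x[0]:", estimate(C, buckets, signs, 0))
    print("private estimate of x[0]:    ", estimate(C_priv, buckets, signs, 0))

The sqrt(reps) factor in the sensitivity reflects that each input coordinate contributes to exactly one bucket in each of the reps repetitions, and the point estimate is the usual median over repetitions; by the post-processing property, all such estimates can be released within the same privacy budget.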
