The Impact of Differential Privacy on Group Disparity Mitigation

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Documents

  • Full text

    Publisher's published version, 724 KB, PDF document

  • Victor Petren Bach Hansen
  • Atula Tejaswi Neerkaje
  • Ramit Sawhney
  • Lucie Flek
  • Anders Søgaard
The performance cost of differential privacy has, for some applications, been shown to be higher for minority groups; fairness, conversely, has been shown to disproportionately compromise the privacy of members of such groups. Most work in this area has been restricted to computer vision and risk assessment. In this paper, we evaluate the impact of differential privacy on fairness across four tasks, focusing on how attempts to mitigate privacy violations and between-group performance differences interact: does privacy inhibit attempts to ensure fairness? To this end, we train (ε, δ)-differentially private models with empirical risk minimization and group distributionally robust training objectives. Consistent with previous findings, we find that differential privacy increases between-group performance differences in the baseline setting; more interestingly, differential privacy reduces between-group performance differences in the robust setting. We explain this by reinterpreting differential privacy as regularization.
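The two training objectives the abstract combines can be illustrated in a minimal NumPy sketch: per-example gradients are clipped and Gaussian noise is added (the standard DP-SGD recipe behind (ε, δ)-differential privacy), while group weights are updated by exponentiated gradient on per-group losses (the group-DRO recipe). This is not the authors' implementation; the function name `dp_group_dro_step` and all hyperparameter values are hypothetical, and the example uses logistic regression purely for concreteness.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dp_group_dro_step(w, X, y, groups, q, clip=1.0, noise_mult=1.0,
                      lr=0.1, eta_q=0.1, rng=None):
    """One DP-SGD-style step with group-DRO reweighting (illustrative sketch).

    w      : (d,) model weights          q      : (k,) group weights, sums to 1
    X, y   : batch of examples/labels    groups : (n,) integer group id per example
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = X.shape
    p = sigmoid(X @ w)
    # per-example logistic losses and gradients
    losses = -(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grads = (p - y)[:, None] * X                          # shape (n, d)
    # DP-SGD ingredient: clip each per-example gradient to norm <= clip
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / clip)
    # group-DRO ingredient: exponentiated-gradient update on group weights,
    # upweighting whichever group currently has the highest loss
    group_ids = np.unique(groups)
    group_losses = np.array([losses[groups == g].mean() for g in group_ids])
    q = q * np.exp(eta_q * group_losses)
    q = q / q.sum()
    example_w = q[np.searchsorted(group_ids, groups)]
    # DP-SGD ingredient: aggregate, then add Gaussian noise (Gaussian mechanism)
    agg = (example_w[:, None] * grads).sum(axis=0) / example_w.sum()
    agg = agg + rng.normal(0.0, noise_mult * clip / n, size=d)
    return w - lr * agg, q
```

Under this sketch, setting `noise_mult=0` and freezing `q` uniform recovers plain empirical risk minimization, which is the baseline setting the abstract contrasts with the robust one.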
Original language: English
Title: Proceedings of the Fourth Workshop on Privacy in Natural Language Processing
Number of pages: 14
Publisher: Association for Computational Linguistics
Publication date: 2022
DOI
Status: Published - 2022
Event: 4th Workshop on Privacy in Natural Language Processing - Seattle, United States
Duration: 1 Jul 2022 – 1 Jul 2022

Conference

Conference: 4th Workshop on Privacy in Natural Language Processing
Location: Seattle, United States
Country: USA
City: Seattle
Period: 01/07/2022 – 01/07/2022

ID: 341493148