Achieving Differential Privacy#
In Brief#
Differential privacy guarantees can be provided by perturbation mechanisms, which randomize the output distributions of functions.
More in Detail#
Perturbation mechanisms aim at randomizing the output distributions of functions in order to provide privacy guarantees. We concentrate here on the major mechanisms able to provide differential privacy guarantees (see Differential Privacy). The Laplace Mechanism is dedicated to perturbing functions that output real values, allowing them to satisfy \(\epsilon\)-Differential Privacy. The Exponential Mechanism is a generalization of the Laplace mechanism and also satisfies \(\epsilon\)-differential privacy. The Gaussian Mechanism is a variant of the Laplace mechanism, applying to functions that output real values as well, but allowing them to satisfy the \((\epsilon,\delta)\)-differential privacy relaxation. These three mechanisms are appropriate for perturbing centralized functions that input a full dataset (e.g., a sum query). They are often used as basic building blocks, combined or not, for perturbing more elaborate functions. Finally, Randomized Response Mechanisms input and output one single row at a time (represented as a vector of bits): they are local mechanisms and can be applied prior to the data collection.
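The four mechanisms above can be sketched as follows. This is a minimal illustration in Python/NumPy, not code from this entry: the function names are ours, and the Gaussian noise calibration shown is the classic \(\sigma = \Delta\sqrt{2\ln(1.25/\delta)}/\epsilon\) bound, which assumes \(\epsilon < 1\).

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Perturb a real-valued query output to satisfy epsilon-DP:
    add Laplace noise with scale sensitivity/epsilon."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(0.0, sensitivity / epsilon)

def gaussian_mechanism(value, sensitivity, epsilon, delta, rng=None):
    """Perturb a real-valued query output to satisfy (epsilon, delta)-DP
    (classic calibration, valid for epsilon < 1)."""
    rng = rng or np.random.default_rng()
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return value + rng.normal(0.0, sigma)

def exponential_mechanism(candidates, utility, sensitivity, epsilon, rng=None):
    """Select one candidate with probability proportional to
    exp(epsilon * utility / (2 * sensitivity)); satisfies epsilon-DP."""
    rng = rng or np.random.default_rng()
    scores = np.array([utility(c) for c in candidates], dtype=float)
    # Subtract the max score before exponentiating for numerical stability.
    weights = np.exp(epsilon * (scores - scores.max()) / (2.0 * sensitivity))
    probs = weights / weights.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

def randomized_response(bit, epsilon, rng=None):
    """Warner-style local mechanism on a single bit: report the true bit
    with probability e^eps / (e^eps + 1), otherwise flip it."""
    rng = rng or np.random.default_rng()
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit
```

For instance, a sum query over binary attributes has sensitivity 1, so `laplace_mechanism(sum(column), 1.0, epsilon)` releases an \(\epsilon\)-differentially private sum; the randomized response function would instead be applied to each individual's bit before collection.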
History#
The earliest known randomized response mechanism was proposed in the 1960s (decades before differential privacy) by Warner, a sociologist who wanted to improve the reliability of responses to sensitive questions by letting the interviewee perturb his/her answer in a controlled manner. Thanks to the simplicity of their implementation and to the differential privacy guarantees that they provide, randomized response mechanisms have generated renewed interest from both academia and industry.
The Laplace mechanism (resp. the Gaussian mechanism) was proposed jointly with the seminal \(\epsilon\)-differential privacy model (resp. the \((\epsilon, \delta)\)-differential privacy model). The Exponential mechanism, which generalizes the Laplace mechanism, was proposed the following year.
Bibliography#
The early randomized response mechanism was proposed in [3], the Laplace mechanism in [1], and the Exponential mechanism in [4]. The Gaussian mechanism was shown to satisfy \((\epsilon, \delta)\)-differential privacy in [2]. Finally, [5] provides an overview and an evaluation of various randomized response mechanisms proposed before 2021.
- 1
Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam D. Smith. Calibrating noise to sensitivity in private data analysis. In TCC. 2006.
- 2
Cynthia Dwork, Krishnaram Kenthapadi, Frank McSherry, Ilya Mironov, and Moni Naor. Our data, ourselves: privacy via distributed noise generation. In EUROCRYPT. 2006.
- 3
Stanley L. Warner. Randomized response: a survey technique for eliminating evasive answer bias. Journal of the American Statistical Association, 60(309):63–69, 1965.
- 4
Frank McSherry and Kunal Talwar. Mechanism design via differential privacy. 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS'07), pages 94–103, 2007.
- 5
Graham Cormode, Samuel Maddock, and Carsten Maple. Frequency estimation under local differential privacy. Proc. VLDB Endow., 14(11):2046–2058, 2021.
This entry was written by Tristan Allard.