

arXiv:1910.04137 [cs.CR]

Automated Methods for Checking Differential Privacy

Gilles Barthe, Rohit Chadha, Vishal Jagannath, A. Prasad Sistla, Mahesh Viswanathan

Published 2019-10-09 (Version 1)

Differential privacy is a de facto standard for statistical computations over databases that contain private data. The strength of differential privacy lies in a rigorous mathematical definition that guarantees individual privacy and yet allows for accurate statistical results. Thanks to this mathematical definition, differential privacy is also a natural target for formal analysis. A broad line of work uses logical methods for proving privacy; however, these methods are incomplete and only partially automated. A recent and complementary line of work uses statistical methods for finding privacy violations; however, these methods provide only statistical guarantees, not proofs. We propose the first decision procedure for checking the differential privacy of a non-trivial class of probabilistic computations. Our procedure takes as input a program $P$ parametrized by a privacy budget $\epsilon$ and either proves differential privacy for all possible values of $\epsilon$ or generates a counterexample. Moreover, our procedure applies both to $\epsilon$-differential privacy and to $(\epsilon,\delta)$-differential privacy. Technically, the decision procedure rests on a novel and judicious encoding of the semantics of programs in our class into a decidable fragment of the first-order theory of the reals with exponentiation. We implement our procedure and use it to (dis)prove privacy bounds for many well-known examples, including randomized response, histogram, report noisy max, and sparse vector.
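
To make the privacy condition concrete: a randomized mechanism $M$ is $\epsilon$-differentially private if $\Pr[M(x) \in S] \le e^{\epsilon} \cdot \Pr[M(x') \in S]$ for all adjacent inputs $x, x'$ and all output sets $S$. The Python sketch below is not the authors' tool; it merely spot-checks this inequality for randomized response at a few fixed budgets, assuming the standard parametrization in which the true bit is reported with probability $e^{\epsilon}/(1+e^{\epsilon})$. The paper's decision procedure, in contrast, settles the property symbolically for all values of $\epsilon$ at once.

    import math

    def rr_dist(bit, eps):
        # Randomized response over a single bit: report the true bit with
        # probability e^eps / (1 + e^eps), and the flipped bit otherwise.
        # (This parametrization is an assumption, not taken from the paper.)
        p = math.exp(eps) / (1.0 + math.exp(eps))
        return {bit: p, 1 - bit: 1.0 - p}

    def satisfies_eps_dp(dist_x, dist_xp, eps, tol=1e-12):
        # eps-DP over discrete outputs: Pr[M(x)=o] <= e^eps * Pr[M(x')=o]
        # must hold for every output o; the caller checks both directions.
        outputs = set(dist_x) | set(dist_xp)
        return all(dist_x.get(o, 0.0) <= math.exp(eps) * dist_xp.get(o, 0.0) + tol
                   for o in outputs)

    # Spot-check a few budgets; with this parametrization the likelihood
    # ratio between adjacent inputs is exactly e^eps, so the bound is tight.
    for eps in (0.1, 0.5, 1.0, 2.0):
        d0, d1 = rr_dist(0, eps), rr_dist(1, eps)
        assert satisfies_eps_dp(d0, d1, eps) and satisfies_eps_dp(d1, d0, eps)
    print("randomized response passed the eps-DP spot checks")

Such sampling of $\epsilon$ can only find violations or build confidence; it proves nothing about unchecked budgets, which is precisely the gap the paper's all-$\epsilon$ decision procedure closes.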