arXiv Analytics

arXiv:2008.13151 [cs.CR]

Data Sanitisation Protocols for the Privacy Funnel with Differential Privacy Guarantees

Milan Lopuhaä-Zwakenberg, Haochen Tong, Boris Škorić

Published 2020-08-30 (Version 1)

In the Open Data approach, governments and other public organisations want to share their datasets with the public, both for accountability and to support participation. Data must be opened in such a way that individual privacy is safeguarded. The Privacy Funnel is a mathematical approach that produces a sanitised database that does not leak private data beyond a chosen threshold. Its downsides are that it does not give worst-case privacy guarantees, and that finding optimal sanitisation protocols can be computationally prohibitive. We tackle both problems by using differential privacy metrics, and by considering local protocols, which operate on one entry at a time. We show that under both the Local Differential Privacy (LDP) and Local Information Privacy (LIP) leakage metrics, optimal protocols can be found efficiently. Furthermore, LIP is both more closely aligned with the privacy requirements of the Privacy Funnel scenario and more efficiently computable. We also consider the scenario where each user has multiple attributes, for which we define Side-channel Resistant Local Information Privacy, and we give efficient methods to find protocols satisfying this criterion while still offering good utility. Finally, we introduce Conditional Reporting, an explicit LIP protocol that can be used when the optimal protocol is infeasible to compute. Experiments on real-world and synthetic data confirm the validity of these methods.
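The abstract's local protocols sanitise one database entry at a time. The paper's own optimal protocols and Conditional Reporting are not reproduced here; as a minimal sketch of the kind of mechanism the LDP leakage metric bounds, the classic k-ary randomized response below reports each entry truthfully with probability e^ε / (e^ε + k − 1) and otherwise reports a uniformly random other value, which satisfies ε-local differential privacy. Function and parameter names are illustrative, not from the paper.

```python
import math
import random


def randomized_response(x: int, k: int, eps: float) -> int:
    """k-ary randomized response on an entry x in {0, ..., k-1}.

    Reports the true value with probability e^eps / (e^eps + k - 1),
    otherwise a uniformly chosen *other* value. For any two inputs
    x, x' and any output y, the output probabilities differ by a
    factor of at most e^eps, i.e. the mechanism is eps-LDP.
    """
    p_true = math.exp(eps) / (math.exp(eps) + k - 1)
    if random.random() < p_true:
        return x
    # Pick uniformly among the k-1 values other than x:
    # draw from {0, ..., k-2} and shift to skip x itself.
    y = random.randrange(k - 1)
    return y if y < x else y + 1
```

Because each entry is perturbed independently, such a protocol can be applied to a whole column of a database without any coordination between users, which is what makes the local model computationally cheap compared to optimising over the full joint database.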

Comments: This preprint is an extended version of arXiv:2002.01501 (Fourteenth International Conference on the Digital Society, 2020)
Categories: cs.CR, cs.DB, cs.IT, math.IT
Related articles:
arXiv:1612.02298 [cs.CR] (Published 2016-12-07)
Individual Differential Privacy: A Utility-Preserving Formulation of Differential Privacy Guarantees
arXiv:2002.01501 [cs.CR] (Published 2020-02-04)
The Privacy Funnel from the viewpoint of Local Differential Privacy
arXiv:2001.04958 [cs.CR] (Published 2020-01-14)
Differentially Private and Fair Classification via Calibrated Functional Mechanism