arXiv Analytics

arXiv:2001.04958 [cs.CR]

Differentially Private and Fair Classification via Calibrated Functional Mechanism

Jiahao Ding, Xinyue Zhang, Xiaohuan Li, Junyi Wang, Rong Yu, Miao Pan

Published 2020-01-14 (Version 1)

Machine learning is increasingly becoming a powerful tool for decision making in a wide variety of applications, such as medical diagnosis and autonomous driving. Privacy concerns about the training data and unfair behaviors of some decisions with regard to certain attributes (e.g., sex, race) are becoming more critical. Thus, constructing a fair machine learning model while simultaneously providing privacy protection becomes a challenging problem. In this paper, we focus on the design of a classification model with fairness and differential privacy guarantees by jointly combining the functional mechanism and decision boundary fairness. To enforce $\epsilon$-differential privacy and fairness, we leverage the functional mechanism to add different amounts of Laplace noise for different attributes to the polynomial coefficients of the objective function, in consideration of the fairness constraint. We further propose a utility-enhancement scheme, the relaxed functional mechanism, which adds Gaussian noise instead of Laplace noise and thereby achieves $(\epsilon,\delta)$-differential privacy. Based on the relaxed functional mechanism, we then design an $(\epsilon,\delta)$-differentially private and fair classification model. Moreover, our theoretical analysis and empirical results demonstrate that both approaches achieve fairness and differential privacy while preserving good utility, and outperform state-of-the-art algorithms.
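To make the mechanism concrete, the following Python sketch shows the basic functional-mechanism recipe the abstract describes: approximate the logistic loss by a degree-2 Taylor polynomial, perturb its coefficients with Laplace noise for $\epsilon$-differential privacy, or with Gaussian noise for the relaxed $(\epsilon,\delta)$ variant, and minimize the noisy objective. The function name, the sensitivity bound, and the Gaussian calibration are illustrative assumptions following the standard functional-mechanism analysis (Zhang et al., 2012); the paper's calibrated per-attribute noise allocation and the decision-boundary fairness constraint are not reproduced here.

```python
import numpy as np

def functional_mechanism_logreg(X, y, epsilon, noise="laplace", delta=1e-5):
    """Perturb the polynomial coefficients of an approximated logistic
    loss, then minimize the noisy objective.

    Assumes each feature x_ij is in [-1, 1] and each label y_i is in {0, 1}.
    """
    n, d = X.shape

    # Degree-2 Taylor expansion of the logistic loss around w = 0:
    #   sum_i [log 2 + (1/2 - y_i) x_i^T w + (1/8) (x_i^T w)^2],
    # so the data-dependent polynomial coefficients are:
    lam1 = (0.5 - y) @ X        # linear terms, shape (d,)
    lam2 = 0.125 * (X.T @ X)    # quadratic terms, shape (d, d)

    # Global L1 sensitivity when one record changes: per record the
    # linear coefficients sum to at most d/2 and the quadratic ones to
    # at most d^2/8, so Delta <= 2 * (d/2 + d^2/8) = d + d^2/4.
    sensitivity = d + d ** 2 / 4.0

    if noise == "laplace":
        # epsilon-DP via the Laplace mechanism on the coefficients.
        lam1 = lam1 + np.random.laplace(0.0, sensitivity / epsilon, size=lam1.shape)
        lam2 = lam2 + np.random.laplace(0.0, sensitivity / epsilon, size=lam2.shape)
    else:
        # (epsilon, delta)-DP via the Gaussian mechanism; the L1 bound
        # above is used as a conservative stand-in for the L2 sensitivity.
        sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
        lam1 = lam1 + np.random.normal(0.0, sigma, size=lam1.shape)
        lam2 = lam2 + np.random.normal(0.0, sigma, size=lam2.shape)

    # Symmetrize the noisy quadratic term and solve the first-order
    # condition lam1 + 2 * lam2 @ w = 0; a small ridge term keeps the
    # system well-posed after noise (a possibly indefinite noisy
    # objective would need more careful handling in practice).
    lam2 = 0.5 * (lam2 + lam2.T)
    w = np.linalg.solve(2.0 * lam2 + 1e-3 * np.eye(d), -lam1)
    return w

# Example: fit on synthetic data with both noise variants.
rng = np.random.default_rng(0)
X = np.clip(rng.normal(size=(1000, 5)), -1.0, 1.0)
y = (X @ np.ones(5) + rng.normal(size=1000) > 0).astype(float)
w_laplace = functional_mechanism_logreg(X, y, epsilon=1.0, noise="laplace")
w_gauss = functional_mechanism_logreg(X, y, epsilon=1.0, noise="gaussian", delta=1e-5)
```

The closed-form solve is possible precisely because the perturbed objective stays a low-degree polynomial in $w$, which is the reason the functional mechanism injects noise into coefficients rather than into gradients or outputs.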

Related articles:
arXiv:2304.02959 [cs.CR] (Published 2023-04-06)
When approximate design for fast homomorphic computation provides differential privacy guarantees
arXiv:2312.11581 [cs.CR] (Published 2023-12-18)
Protect Your Score: Contact Tracing With Differential Privacy Guarantees
arXiv:1612.02298 [cs.CR] (Published 2016-12-07)
Individual Differential Privacy: A Utility-Preserving Formulation of Differential Privacy Guarantees