arXiv Analytics

arXiv:2311.13624 [cs.LG]

A Theoretical Insight into Attack and Defense of Gradient Leakage in Transformer

Chenyang Li, Zhao Song, Weixin Wang, Chiwun Yang

Published 2023-11-22 (Version 1)

The Deep Leakage from Gradients (DLG) attack has emerged as a prevalent and highly effective method for extracting sensitive training data by inspecting exchanged gradients, posing a substantial threat to the privacy of individuals and organizations alike. This work presents a comprehensive analysis of gradient leakage applied specifically to transformer-based models. We demonstrate that data can be accurately recovered from gradients alone and rigorously characterize the conditions under which such gradient attacks can be executed. We then revisit the defense of adding noise to the gradients: we give a theoretical analysis of the associated privacy cost within the framework of differential privacy, and we establish the convergence of Stochastic Gradient Descent (SGD) under the perturbed gradients. The primary objective of this study is to deepen the understanding of gradient leakage attacks and defense strategies and to contribute to privacy-preserving techniques tailored to transformer-based models. By shedding light on the vulnerabilities and countermeasures associated with gradient leakage, this research aims to foster advancements in safeguarding sensitive data and upholding privacy in the context of transformer-based models.
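To make the attack setting and the noise defense concrete, below is a minimal PyTorch sketch of a DLG-style gradient-matching attack; it is an illustrative assumption, not the authors' exact procedure. The victim computes gradients on one private example, optionally perturbs them with Gaussian noise, and the attacker optimizes dummy inputs and soft labels so that their gradients match the shared ones. The tiny MLP victim_model, the LBFGS optimizer, and the noise scale sigma are hypothetical choices for illustration; the paper's setting would use a transformer in place of the toy model.

import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Victim side: a toy classifier (stand-in for a transformer) and one private example.
victim_model = torch.nn.Sequential(
    torch.nn.Linear(16, 8), torch.nn.Tanh(), torch.nn.Linear(8, 4)
)
x_private = torch.randn(1, 16)
y_private = torch.tensor([2])

loss = F.cross_entropy(victim_model(x_private), y_private)
true_grads = torch.autograd.grad(loss, victim_model.parameters())

# Optional defense discussed in the abstract: share noisy gradients instead of exact ones.
sigma = 0.0  # hypothetical noise scale; set > 0 to simulate the perturbed-gradient defense
shared_grads = [g.detach() + sigma * torch.randn_like(g) for g in true_grads]

# Attacker side: optimize dummy data and soft labels so their gradients match the shared ones.
x_dummy = torch.randn(1, 16, requires_grad=True)
y_dummy = torch.randn(1, 4, requires_grad=True)
optimizer = torch.optim.LBFGS([x_dummy, y_dummy])

def closure():
    optimizer.zero_grad()
    dummy_loss = F.cross_entropy(victim_model(x_dummy), F.softmax(y_dummy, dim=-1))
    dummy_grads = torch.autograd.grad(dummy_loss, victim_model.parameters(), create_graph=True)
    grad_diff = sum(((dg - sg) ** 2).sum() for dg, sg in zip(dummy_grads, shared_grads))
    grad_diff.backward()
    return grad_diff

for _ in range(50):
    optimizer.step(closure)

print("reconstruction error:", (x_dummy.detach() - x_private).norm().item())

With sigma = 0 the dummy input typically converges close to the private example, which is the leakage analyzed above; increasing sigma degrades the reconstruction, mirroring the noisy-gradient defense whose differential-privacy cost and SGD convergence the paper studies.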

Related articles:
arXiv:1802.08009 [cs.LG] (Published 2018-02-22)
Iterate averaging as regularization for stochastic gradient descent
arXiv:1906.07405 [cs.LG] (Published 2019-06-18)
The Multiplicative Noise in Stochastic Gradient Descent: Data-Dependent Regularization, Continuous and Discrete Approximation
arXiv:1905.13210 [cs.LG] (Published 2019-05-30)
Generalization Bounds of Stochastic Gradient Descent for Wide and Deep Neural Networks