arXiv:2205.13574 [cs.LG]
Pruning has a disparate impact on model accuracy
Cuong Tran, Ferdinando Fioretto, Jung-Eun Kim, Rakshit Naidu
Published 2022-05-26 (Version 1)
Network pruning is a widely used compression technique that can significantly scale down overparameterized models with minimal loss of accuracy. This paper shows that pruning may create or exacerbate disparate impacts. The paper sheds light on the factors causing such disparities, identifying differences in gradient norms and in distance to the decision boundary across groups as responsible for this critical issue. It analyzes these factors in detail, provides both theoretical and empirical support, and proposes a simple yet effective solution that mitigates the disparate impacts caused by pruning.
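The kind of experiment the abstract describes can be illustrated with a minimal sketch: magnitude pruning zeroes out the smallest-magnitude weights, and per-group accuracy before and after pruning reveals any disparate impact. The helper names and the linear-classifier setup below are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Zero out the smallest-magnitude entries of w (global magnitude pruning)."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # Threshold at the k-th smallest absolute value across all weights.
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def group_accuracy(w, X, y, groups):
    """Accuracy of a linear classifier sign(X @ w) computed per group label."""
    preds = (X @ w > 0).astype(int)
    return {g: float((preds[groups == g] == y[groups == g]).mean())
            for g in np.unique(groups)}
```

Comparing `group_accuracy(w, ...)` against `group_accuracy(magnitude_prune(w, s), ...)` at increasing sparsity `s` would show whether accuracy degrades evenly across groups or falls disproportionately on one of them.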
Categories: cs.LG
Related articles:
arXiv:2304.13933 [cs.LG] (Published 2023-04-27)
Oversampling Higher-Performing Minorities During Machine Learning Model Training Reduces Adverse Impact Slightly but Also Reduces Model Accuracy
arXiv:1908.02802 [cs.LG] (Published 2019-08-07)
Investigating Decision Boundaries of Trained Neural Networks
arXiv:1805.05532 [cs.LG] (Published 2018-05-15)
Improving Knowledge Distillation with Supporting Adversarial Samples