arXiv:2403.03375 [cs.LG]

Complexity Matters: Dynamics of Feature Learning in the Presence of Spurious Correlations

GuanWen Qiu, Da Kuang, Surbhi Goel

Published 2024-03-05, updated 2024-06-16 (Version 2)

Existing research often posits spurious features as easier to learn than core features in neural network optimization, but the impact of their relative simplicity remains under-explored. Moreover, most studies focus on end performance rather than the dynamics of feature learning. In this paper, we propose a theoretical framework and an associated synthetic dataset grounded in boolean function analysis. This setup allows fine-grained control over the relative complexity (compared to core features) and correlation strength (with respect to the label) of spurious features, enabling the study of feature-learning dynamics under spurious correlations. Our findings uncover several interesting phenomena: (1) stronger spurious correlations or simpler spurious features slow down the learning of core features, (2) two distinct subnetworks are formed to learn core and spurious features separately, (3) the learning phases of spurious and core features are not always separable, (4) spurious features are not forgotten even after core features are fully learned. We demonstrate that our findings justify the success of retraining the last layer to remove spurious correlations, and we identify limitations of popular debiasing algorithms that exploit the early learning of spurious features. We support our empirical findings with theoretical analyses for the case of learning XOR features with a one-hidden-layer ReLU network.
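The abstract describes a synthetic boolean dataset in which the core feature is a parity (XOR) of input coordinates and a simpler spurious feature correlates with the label at a controllable strength. A minimal sketch of such a construction is shown below; the function name, dimensions, and parameterization are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

def make_boolean_dataset(n=1000, dim=10, rho=0.9, seed=0):
    """Toy boolean dataset with a spurious correlation.

    Core feature: degree-2 parity (XOR) of coordinates 0 and 1,
    which defines the label y in {-1, +1}.
    Spurious feature: a single (degree-1, hence simpler) coordinate
    that agrees with the label with probability rho.
    """
    rng = np.random.default_rng(seed)
    # Background coordinates are uniform over {-1, +1}.
    X = rng.choice([-1.0, 1.0], size=(n, dim))
    # Label is the XOR (product) of the first two coordinates.
    y = X[:, 0] * X[:, 1]
    # Overwrite coordinate 2 so it matches y with probability rho.
    agree = rng.random(n) < rho
    X[:, 2] = np.where(agree, y, -y)
    return X, y
```

Varying `rho` controls the correlation strength, while swapping the degree-2 parity for a higher-degree parity (or the spurious coordinate for a parity of its own) controls the relative complexity of the two features.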

Comments: Accepted to ICML 2024 with the title "Complexity Matters: Feature Learning in the Presence of Spurious Correlations"
Categories: cs.LG