arXiv:1505.07634 [cs.LG]

Learning with Symmetric Label Noise: The Importance of Being Unhinged

Brendan van Rooyen, Aditya Krishna Menon, Robert C. Williamson

Published 2015-05-28 (Version 1)

Convex potential minimisation is the de facto approach to binary classification. However, Long and Servedio [2010] proved that under symmetric label noise (SLN), minimisation of any convex potential over a linear function class can result in classification performance equivalent to random guessing. This ostensibly shows that convex losses are not SLN-robust. In this paper, we propose a convex, classification-calibrated loss and prove that it is SLN-robust. The loss avoids the Long and Servedio [2010] result by virtue of being negatively unbounded. The loss is a modification of the hinge loss, where one does not clamp at zero; hence, we call it the unhinged loss. We show that the optimal unhinged solution is equivalent to that of a strongly regularised SVM, and is the limiting solution for any convex potential; this implies that strong ℓ2 regularisation makes most standard learners SLN-robust. Experiments confirm the SLN-robustness of the unhinged loss.
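As a concrete illustration (a minimal sketch, not the authors' code), the unhinged loss on a margin v = y f(x) is simply 1 − v, and for a linear scorer with ℓ2 regularisation the regularised empirical risk admits a closed-form minimiser proportional to the mean of the label-weighted inputs. The function names and the synthetic-data demo below are hypothetical:

    import numpy as np

    # Unhinged loss: the hinge loss 1 - v without the clamp at zero,
    # so it is linear (and negatively unbounded) in the margin v = y * f(x).
    def unhinged_loss(margins):
        return 1.0 - margins

    # For a linear scorer f(x) = w . x, the l2-regularised empirical risk
    #   mean_i (1 - y_i * (w . x_i)) + lam * ||w||^2
    # is minimised in closed form at w = mean_i(y_i * x_i) / (2 * lam).
    def fit_unhinged_linear(X, y, lam=1.0):
        return (y[:, None] * X).mean(axis=0) / (2.0 * lam)

    # Hypothetical demo: two Gaussian classes with 30% symmetric label noise.
    rng = np.random.default_rng(0)
    n = 500
    X = np.vstack([rng.normal(-1.0, 1.0, size=(n, 2)),
                   rng.normal(+1.0, 1.0, size=(n, 2))])
    y = np.concatenate([-np.ones(n), np.ones(n)])
    flip = rng.random(2 * n) < 0.3            # symmetric label noise
    y_noisy = np.where(flip, -y, y)

    w = fit_unhinged_linear(X, y_noisy, lam=1.0)
    acc = np.mean(np.sign(X @ w) == y)        # accuracy on clean labels
    print(f"clean accuracy under 30% SLN: {acc:.3f}")

The intuition for SLN-robustness is visible in the closed form: under symmetric flips at rate ρ < 1/2, the expectation of the noisy label-weighted input is (1 − 2ρ) times its clean counterpart, so noise only rescales the solution without changing the sign of the induced classifier.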

Related articles:
arXiv:2007.06324 [cs.LG] (Published 2020-07-13)
TrustNet: Learning from Trusted Data Against (A)symmetric Label Noise
arXiv:1901.02271 [cs.LG] (Published 2019-01-08)
Cost Sensitive Learning in the Presence of Symmetric Label Noise
arXiv:1909.09868 [cs.LG] (Published 2019-09-21)
On the Importance of Delexicalization for Fact Verification