arXiv:1705.03419 [cs.CV]

Learning Deep Networks from Noisy Labels with Dropout Regularization

Ishan Jindal, Matthew Nokleby, Xuewen Chen

Published 2017-05-09, Version 1

Large datasets often have unreliable labels, such as those obtained from Amazon's Mechanical Turk or social media platforms, and classifiers trained on mislabeled datasets often exhibit poor performance. We present a simple, effective technique for accounting for label noise when training deep neural networks. We augment a standard deep network with a softmax layer that models the label noise statistics. Then, we train the deep network and noise model jointly via end-to-end stochastic gradient descent on the (perhaps mislabeled) dataset. The augmented model is overdetermined, so in order to encourage the learning of a non-trivial noise model, we apply dropout regularization to the weights of the noise model during training. Numerical experiments on noisy versions of the CIFAR-10 and MNIST datasets show that the proposed dropout technique outperforms state-of-the-art methods.
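
To make the idea concrete, here is a minimal sketch in PyTorch, not the authors' released code: a base classifier's softmax output is passed through an extra linear "noise" layer that models label corruption, and dropout is applied to that layer's weights so it cannot trivially absorb all of the fitting. The base architecture, class count, dropout rate, and all names are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLabelNet(nn.Module):
    """Base classifier plus a dropout-regularized label-noise layer (sketch)."""

    def __init__(self, num_classes=10, noise_dropout=0.5):
        super().__init__()
        # Base network: any architecture ending in class logits
        # (a small MLP here purely for illustration).
        self.base = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 256), nn.ReLU(),
            nn.Linear(256, num_classes),
        )
        # Noise model: a linear map from estimated "true" class
        # probabilities to observed (noisy) label scores,
        # initialized near the identity (i.e., no noise).
        self.noise = nn.Linear(num_classes, num_classes, bias=False)
        nn.init.eye_(self.noise.weight)
        self.noise_dropout = nn.Dropout(noise_dropout)

    def forward(self, x):
        clean_logits = self.base(x)            # estimate of the true label
        p_clean = F.softmax(clean_logits, dim=1)
        # Dropout on the noise-model *weights* discourages the trivial
        # solution where the noise layer memorizes the label corruption.
        w = self.noise_dropout(self.noise.weight)
        noisy_logits = p_clean @ w.t()         # modeled noisy-label scores
        return clean_logits, noisy_logits

# One end-to-end SGD step on (perhaps mislabeled) data.
model = NoisyLabelNet()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(32, 1, 28, 28)                 # fake MNIST-shaped batch
y = torch.randint(0, 10, (32,))                # observed (noisy) labels
clean_logits, noisy_logits = model(x)
loss = F.cross_entropy(noisy_logits, y)        # fit the *noisy* output to y
opt.zero_grad()
loss.backward()
opt.step()
```

At test time one would call model.eval() and predict from clean_logits alone; the noise layer exists only to explain label corruption during training.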

Comments: Published at the 2016 IEEE 16th International Conference on Data Mining (ICDM)
Categories: cs.CV, cs.LG, stat.ML
Related articles:
arXiv:1806.02612 [cs.CV] (Published 2018-06-07)
Dimensionality-Driven Learning with Noisy Labels
Xingjun Ma et al.
arXiv:2007.05836 [cs.CV] (Published 2020-07-11)
Meta Soft Label Generation for Noisy Labels
arXiv:1912.02911 [cs.CV] (Published 2019-12-05)
Deep learning with noisy labels: exploring techniques and remedies in medical image analysis