arXiv:2003.03397 [cs.LG]

Dropout: Explicit Forms and Capacity Control

Raman Arora, Peter Bartlett, Poorya Mianjy, Nathan Srebro

Published 2020-03-06, Version 1

We investigate the capacity control provided by dropout in various machine learning problems. First, we study dropout for matrix completion, where it induces a data-dependent regularizer that, in expectation, equals the weighted trace-norm of the product of the factors. In deep learning, we show that the data-dependent regularizer due to dropout directly controls the Rademacher complexity of the underlying class of deep neural networks. These developments enable us to give concrete generalization error bounds for the dropout algorithm, both for matrix completion and for training deep neural networks. We evaluate our theoretical findings on real-world datasets, including MovieLens, MNIST, and Fashion-MNIST.
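As a concrete illustration of the first claim, the following is a minimal NumPy sketch (a toy setting, not the paper's exact construction) of dropout applied to matrix factorization. With keep probability `theta`, a Bernoulli mask on the shared factor dimension, and all entries of the target matrix observed with squared loss, the expected dropout objective can be computed exactly by enumerating masks and checked against the closed form: plain reconstruction loss plus the explicit regularizer `((1 - theta) / theta) * sum_k ||u_k||^2 ||v_k||^2`, a product-of-norms penalty closely related to the trace-norm of `U @ V.T`. All variable names here are illustrative.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy matrix-completion setup: approximate Y by U @ V.T,
# with dropout on the shared (inner) factor dimension.
n, m, k = 8, 6, 4
Y = rng.standard_normal((n, m))
U = rng.standard_normal((n, k))
V = rng.standard_normal((m, k))
theta = 0.7  # keep probability (dropout rate is 1 - theta)

# Exact expected dropout loss: enumerate all 2^k Bernoulli masks b,
# rescaling by 1/theta so the predictor is unbiased in expectation.
expected_loss = 0.0
for b in product([0, 1], repeat=k):
    prob = np.prod([theta if bi else 1 - theta for bi in b])
    mask = np.array(b, dtype=float)
    M = (U * mask) @ V.T / theta
    expected_loss += prob * np.sum((M - Y) ** 2)

# Closed form: plain squared loss plus the explicit regularizer.
plain_loss = np.sum((U @ V.T - Y) ** 2)
regularizer = (1 - theta) / theta * sum(
    np.sum(U[:, j] ** 2) * np.sum(V[:, j] ** 2) for j in range(k)
)

print(expected_loss, plain_loss + regularizer)  # the two quantities agree
```

The decomposition follows from the bias-variance identity: the mean of the masked predictor is `U @ V.T`, and the variance term, summed over entries, collapses to the product-of-norms penalty.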

Related articles: Most relevant | Search more
arXiv:0901.3150 [cs.LG] (Published 2009-01-20, updated 2009-09-17)
Matrix Completion from a Few Entries
arXiv:1904.08540 [cs.LG] (Published 2019-04-17)
Matrix Completion With Selective Sampling
arXiv:1903.00702 [cs.LG] (Published 2019-03-02)
Matrix Completion via Nonconvex Regularization: Convergence of the Proximal Gradient Algorithm