arXiv:2409.07434 [stat.ML]

Asymptotics of Stochastic Gradient Descent with Dropout Regularization in Linear Models

Jiaqi Li, Johannes Schmidt-Hieber, Wei Biao Wu

Published 2024-09-11, Version 1

This paper proposes an asymptotic theory for online inference of the stochastic gradient descent (SGD) iterates with dropout regularization in linear regression. Specifically, we establish the geometric-moment contraction (GMC) property for constant step-size SGD iterates with dropout, which implies the existence of a unique stationary distribution of the dropout recursion. Using the GMC property, we provide quenched central limit theorems (CLTs) for the difference between the dropout and $\ell^2$-regularized iterates, regardless of initialization. A CLT for the difference between the Ruppert-Polyak averaged SGD (ASGD) iterates with dropout and the corresponding $\ell^2$-regularized iterates is also presented. Based on these asymptotic normality results, we further introduce an online estimator for the long-run covariance matrix of the ASGD dropout iterates, enabling recursive inference that is efficient in both computation time and memory. Numerical experiments demonstrate that, for sufficiently large samples, the proposed confidence intervals for ASGD with dropout nearly achieve the nominal coverage probability.
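
Since the abstract centers on a constant step-size dropout SGD recursion for linear regression and on Ruppert-Polyak averaging of its iterates, a minimal simulation sketch may help fix ideas. The function name dropout_sgd_asgd, the parameters lr and keep_prob, and the specific form of the update rule below are illustrative assumptions rather than the paper's notation, and the paper's online long-run covariance estimator for constructing confidence intervals is not reproduced here.

import numpy as np

def dropout_sgd_asgd(X, y, lr=0.01, keep_prob=0.8, seed=0):
    # Constant step-size SGD with dropout applied to the covariates of a linear
    # model, together with the running Ruppert-Polyak (ASGD) average of the
    # iterates. The update used here is one common form of the dropout
    # recursion, assumed for illustration rather than taken from the paper:
    #   beta_k = beta_{k-1} + lr * D_k x_k (y_k - x_k^T D_k beta_{k-1}),
    # where D_k is a diagonal Bernoulli(keep_prob) mask drawn at step k.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)       # SGD iterate beta_k
    beta_bar = np.zeros(d)   # Ruppert-Polyak average of beta_1, ..., beta_k
    for k in range(n):
        mask = rng.binomial(1, keep_prob, size=d)  # diagonal dropout mask D_k
        xk = mask * X[k]                           # D_k x_k
        resid = y[k] - xk @ beta                   # y_k - x_k^T D_k beta_{k-1}
        beta = beta + lr * resid * xk              # dropout SGD step
        beta_bar += (beta - beta_bar) / (k + 1)    # recursive ASGD average
    return beta, beta_bar

# Toy usage on simulated data. Dropout acts like a ridge-type ($\ell^2$)
# penalty, which is the comparison the abstract draws; for this isotropic toy
# design the averaged iterates should land near beta_star.
rng = np.random.default_rng(1)
n, d = 5000, 3
beta_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ beta_star + 0.1 * rng.normal(size=n)
beta_last, beta_avg = dropout_sgd_asgd(X, y)
print("last iterate:", beta_last)
print("ASGD average:", beta_avg)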

Related articles:
arXiv:2006.10840 [stat.ML] (Published 2020-06-18)
Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping
arXiv:1911.01483 [stat.ML] (Published 2019-11-04)
Statistical Inference for Model Parameters in Stochastic Gradient Descent via Batch Means
arXiv:1710.06382 [stat.ML] (Published 2017-10-17)
Convergence diagnostics for stochastic gradient descent with constant step size