arXiv:2010.02681 [stat.ML]

Kernel regression in high dimension: Refined analysis beyond double descent

Fanghui Liu, Zhenyu Liao, Johan A. K. Suykens

Published 2020-10-06, Version 1

In this paper, we provide a precise characterization of the generalization properties of high-dimensional kernel ridge regression across the under- and over-parameterized regimes, depending on whether the number of training samples $n$ exceeds the feature dimension $d$. By establishing a novel bias-variance decomposition of the expected excess risk, we show that, while the bias is independent of $d$ and monotonically decreases with $n$, the variance depends on both $n$ and $d$ and can be unimodal or monotonically decreasing under different regularization schemes. Our refined analysis goes beyond the double descent theory by showing that, depending on the data eigen-profile and the level of regularization, the kernel regression risk curve can be a double-descent-like, bell-shaped, or monotonic function of $n$. Experiments on synthetic and real data are conducted to support our theoretical findings.
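As a rough illustration of the risk curves discussed above, the following minimal sketch (not the authors' code) estimates the excess risk of kernel ridge regression on synthetic Gaussian data as the sample size $n$ grows with the dimension $d$ fixed, under two regularization levels. The Gaussian kernel, the linear teacher, and all parameter values are illustrative assumptions rather than the paper's experimental settings.

# Minimal sketch: excess risk of kernel ridge regression versus n at fixed d.
# All settings (d, lambda, RBF kernel, linear teacher) are illustrative assumptions,
# not the paper's experimental configuration.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def krr_excess_risk(n, d, lam, n_test=2000, noise=0.5, n_trials=20):
    # Monte-Carlo estimate of E[(f_hat(x) - f*(x))^2] for kernel ridge regression.
    gamma = 1.0 / d                                  # common bandwidth scaling in high dimension
    risks = []
    for _ in range(n_trials):
        w = rng.standard_normal(d) / np.sqrt(d)      # linear teacher f*(x) = <w, x>
        X = rng.standard_normal((n, d))
        y = X @ w + noise * rng.standard_normal(n)
        K = rbf_kernel(X, X, gamma)
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y)   # ridge solution
        X_te = rng.standard_normal((n_test, d))
        f_hat = rbf_kernel(X_te, X, gamma) @ alpha
        risks.append(np.mean((f_hat - X_te @ w) ** 2))
    return np.mean(risks)

d = 50
for lam in (1e-6, 1e-2):                             # small vs. moderate regularization
    curve = [(n, krr_excess_risk(n, d, lam)) for n in (10, 25, 50, 100, 200, 400)]
    print(f"lambda = {lam:g}:", ", ".join(f"n={n}: {r:.3f}" for n, r in curve))

Plotting the printed risk values against $n$ for the two regularization levels gives a crude empirical view of whether the curve is monotone, bell-shaped, or double-descent-like in this toy setting.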
