arXiv Analytics

arXiv:2106.07148 [stat.ML]

On the Sample Complexity of Learning with Geometric Stability

Alberto Bietti, Luca Venturi, Joan Bruna

Published 2021-06-14 (Version 1)

Many supervised learning problems involve high-dimensional data such as images, text, or graphs. To make efficient use of data, it is often helpful to leverage geometric priors of the problem at hand, such as invariance to translations or to permutation subgroups, or stability to small deformations. We study the sample complexity of learning problems in which the target function exhibits such invariance and stability properties, by considering spherical harmonic decompositions of functions on the sphere. We provide non-parametric rates of convergence for kernel methods, and show that using a kernel that is invariant to a group improves sample complexity by a factor equal to the size of the group, compared to the corresponding non-invariant kernel. These improvements hold once the sample size is large enough, with an asymptotic threshold that depends on spectral properties of the group. Finally, we extend these gains beyond invariance groups to cover geometric stability to small deformations, modeled here as subsets (not necessarily subgroups) of permutations.
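The invariant kernel referred to in the abstract is typically obtained by averaging a base kernel over the group, K_G(x, y) = (1/|G|) * sum over g in G of K(x, g.y). The Python sketch below is a toy illustration of this group-averaging construction, not the paper's exact setup: the polynomial base kernel, its degree, and the choice of the full symmetric group on 3 coordinates are assumptions made for the demo.

    import itertools
    import numpy as np

    def base_kernel(x, y, degree=3):
        # Dot-product kernel on the sphere (polynomial degree is an assumption).
        return float(x @ y) ** degree

    def invariant_kernel(x, y, group, kernel=base_kernel):
        # Average the base kernel over a group of coordinate permutations:
        # K_G(x, y) = (1/|G|) * sum_g K(x, g.y). Since the group is closed
        # under composition, the result is invariant to permuting y (or x).
        group = list(group)
        return sum(kernel(x, y[list(g)]) for g in group) / len(group)

    # Toy example: the full permutation group S_3 acting on d = 3 coordinates.
    d = 3
    G = list(itertools.permutations(range(d)))
    rng = np.random.default_rng(0)
    x = rng.normal(size=d); x /= np.linalg.norm(x)  # points on the sphere
    y = rng.normal(size=d); y /= np.linalg.norm(y)

    g = G[3]  # an arbitrary group element
    print(invariant_kernel(x, y, G))
    print(invariant_kernel(x, y[list(g)], G))  # same value: K_G is G-invariant

Averaging over all |G| elements is what allows the invariant kernel to act on the quotient of the sphere by the group, which is the mechanism behind the |G|-fold sample complexity gain discussed above.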

Related articles:
arXiv:2409.01243 [stat.ML] (Published 2024-09-02)
Sample Complexity of the Sign-Perturbed Sums Method
arXiv:2106.07898 [stat.ML] (Published 2021-06-15)
Divergence Frontiers for Generative Models: Sample Complexity, Quantization Level, and Frontier Integral
arXiv:1011.5395 [stat.ML] (Published 2010-11-24)
The Sample Complexity of Dictionary Learning