arXiv Analytics

arXiv:1106.0730 [stat.ML]

Rademacher complexity of stationary sequences

Daniel J. McDonald, Cosma Rohilla Shalizi

Published 2011-06-03, updated 2017-05-22 (Version 2)

We show how to control the generalization error of time series models in which past values of the outcome are used to predict future values. The results rest on a generalization of standard i.i.d. concentration inequalities to dependent data that does not require the mixing assumptions common in the time series setting. Both the proof and the resulting bounds are simpler than previous analyses of dependent data or stochastic adversaries, which rely on sequential Rademacher complexities rather than the expected Rademacher complexity used for i.i.d. processes. We also derive empirical Rademacher results without mixing assumptions, yielding fully calculable upper bounds.
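The "fully calculable" aspect refers to the empirical Rademacher complexity, which depends only on the observed sample. As a rough illustration (not the paper's method), the sketch below estimates it by Monte Carlo for a finite function class, averaging the supremum of sign-weighted sample means over random Rademacher draws; the function class, sample, and draw count are all hypothetical choices for the example.

```python
import numpy as np

def empirical_rademacher(preds, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat = E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ]
    for a FINITE function class.

    preds: (m, n) array; row j holds f_j(x_1), ..., f_j(x_n),
           the j-th function's values on the observed sample.
    """
    rng = np.random.default_rng(seed)
    m, n = preds.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # random Rademacher signs
        total += np.max(preds @ sigma) / n       # sup over the finite class
    return total / n_draws

# Toy class of two constant predictors (+1 and -1) on a sample of size 100;
# here sup_f (1/n) sum_i sigma_i f(x_i) = |mean(sigma)|, whose expectation
# is roughly sqrt(2 / (pi * n)) ~ 0.08.
preds = np.vstack([np.ones(100), -np.ones(100)])
print(empirical_rademacher(preds))
```

The estimate shrinks at roughly the 1/sqrt(n) rate expected for such a class; in the paper's setting the point is that this same sample-based quantity can bound generalization error for dependent sequences without mixing assumptions.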

Related articles: Most relevant | Search more
arXiv:1806.05161 [stat.ML] (Published 2018-06-13)
Overfitting or perfect fitting? Risk bounds for classification and regression rules that interpolate
arXiv:0902.1733 [stat.ML] (Published 2009-02-10, updated 2010-07-04)
Risk bounds in linear regression through PAC-Bayesian truncation
arXiv:0902.3130 [stat.ML] (Published 2009-02-18, updated 2012-03-01)
Risk Bounds for CART Classifiers under a Margin Condition