{ "id": "1912.03036", "version": "v1", "published": "2019-12-06T09:24:56.000Z", "updated": "2019-12-06T09:24:56.000Z", "title": "Improved PAC-Bayesian Bounds for Linear Regression", "authors": [ "Vera Shalaeva", "Alireza Fakhrizadeh Esfahani", "Pascal Germain", "Mihaly Petreczky" ], "journal": "Thirty-Fourth AAAI Conference on Artificial Intelligence, Feb 2020, New York, United States", "categories": [ "cs.LG", "stat.ML" ], "abstract": "In this paper, we improve the PAC-Bayesian error bound for linear regression derived in Germain et al. [10]. The improvements are twofold. First, the proposed error bound is tighter, and converges to the generalization loss with a well-chosen temperature parameter. Second, the error bound also holds for training data that are not independently sampled. In particular, the error bound applies to certain time series generated by well-known classes of dynamical models, such as ARX models.", "revisions": [ { "version": "v1", "updated": "2019-12-06T09:24:56.000Z" } ], "analyses": { "keywords": [ "linear regression", "pac-bayesian bounds", "pac-bayesian error bound", "error bound applies", "well-chosen temperature parameter" ], "tags": [ "journal article" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }