arXiv:2206.05454 [cs.LG]

A General Framework for PAC-Bayes Bounds for Meta-Learning

Arezou Rezazadeh

Published 2022-06-11 (Version 1)

Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the base learning algorithm, by observing data from a finite number of related tasks. This paper studies PAC-Bayes bounds on the meta-generalization gap. The meta-generalization gap comprises two sources of error: the environment-level and task-level generalization gaps, which arise from observing only a finite number of tasks and only a finite number of data samples per task, respectively. By upper bounding arbitrary convex functions that link the expected and empirical losses at both the environment level and the per-task level, we obtain new PAC-Bayes bounds. Using these bounds, we develop new PAC-Bayes meta-learning algorithms. Numerical examples demonstrate the merits of the proposed bounds and algorithms in comparison to prior PAC-Bayes bounds for meta-learning.
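For orientation, a minimal sketch of the single-task form of such a convex-comparator PAC-Bayes bound (as found in, e.g., Germain et al., 2009) is given below. The notation (prior P, posterior Q, sample S of m i.i.d. examples, comparator D) is illustrative and not taken from the paper, whose bounds extend this style of analysis to both the environment and per-task levels.

% A minimal sketch, not the paper's exact statement: the standard
% single-task PAC-Bayes bound for a jointly convex comparator function
% D, holding with probability at least 1 - \delta over the draw of the
% sample S, simultaneously for all posteriors Q.
\[
  D\bigl(\hat{L}_S(Q),\, L(Q)\bigr)
  \;\le\;
  \frac{1}{m}\left[
    \mathrm{KL}(Q \,\|\, P)
    + \ln\frac{1}{\delta}
    + \ln \mathbb{E}_{S'}\, \mathbb{E}_{h \sim P}\,
      e^{\, m\, D\bigl(\hat{L}_{S'}(h),\, L(h)\bigr)}
  \right]
\]
% Here \hat{L}_S(Q) and L(Q) are the empirical and expected losses of
% the Gibbs predictor under Q. Choosing D(p, q) = kl(p \| q) recovers a
% Seeger-Langford-style bound, while D(p, q) = 2(p - q)^2 recovers a
% McAllester-style bound.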

Related articles:
arXiv:1808.10406 [cs.LG] (Published 2018-08-30)
Towards Reproducible Empirical Research in Meta-Learning
arXiv:2409.05072 [cs.LG] (Published 2024-09-08)
A General Framework for Clustering and Distribution Matching with Bandit Feedback
arXiv:1810.02334 [cs.LG] (Published 2018-10-04)
Unsupervised Learning via Meta-Learning