arXiv Analytics

arXiv:2007.00939 [cs.LG]

BOSH: Bayesian Optimization by Sampling Hierarchically

Henry B. Moss, David S. Leslie, Paul Rayson

Published 2020-07-02 (Version 1)

Deployments of Bayesian Optimization (BO) for functions with stochastic evaluations, such as parameter tuning via cross validation and simulation optimization, typically optimize an average of a fixed set of noisy realizations of the objective function. However, disregarding the true objective function in this manner finds a high-precision optimum of the wrong function. To solve this problem, we propose Bayesian Optimization by Sampling Hierarchically (BOSH), a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations as the optimization progresses. We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyper-parameter tuning tasks.
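To make the failure mode concrete, here is a minimal toy sketch (not the authors' implementation, and with no Gaussian process or information-theoretic acquisition): each stochastic realization shifts the objective's optimum by a seed-specific offset, so averaging a fixed set of realizations optimizes the wrong function to high precision, whereas growing the pool of realizations as the search progresses lets the average converge toward the true objective. The objective, noise model, and grid search are all hypothetical stand-ins chosen for illustration.

```python
import random

def f(x):
    """True (unknown) objective: maximized at x = 0.5."""
    return -(x - 0.5) ** 2

def realization(x, seed):
    """One stochastic realization of f, e.g. one CV split or one
    simulation seed. The seed-specific offset is a hypothetical
    noise model that shifts the realization's optimum."""
    bias = random.Random(seed).uniform(-0.2, 0.2)
    return f(x + bias)

def grid_optimize(objective, grid):
    """Stand-in for the BO inner loop: maximize over a fixed grid."""
    return max(grid, key=objective)

grid = [i / 100 for i in range(101)]

# Standard practice: average a FIXED set of realizations. The result is a
# high-precision optimum of the wrong (biased) averaged function.
fixed_seeds = [0, 1, 2]
x_fixed = grid_optimize(
    lambda x: sum(realization(x, s) for s in fixed_seeds) / len(fixed_seeds),
    grid,
)

# BOSH-style idea, schematically: grow the pool of realizations as the
# optimization progresses, so the averaged objective converges to f.
pool = []
for step in range(30):
    pool.append(step)  # sample one new realization per optimization step
    x_grow = grid_optimize(
        lambda x: sum(realization(x, s) for s in pool) / len(pool),
        grid,
    )

print(x_fixed, x_grow)
```

With only three fixed seeds the mean offset rarely cancels, so `x_fixed` sits away from 0.5; averaging thirty realizations drives the mean offset toward zero, pulling `x_grow` close to the true optimum.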

Related articles:
arXiv:2004.10599 [cs.LG] (Published 2020-04-22)
Bayesian Optimization with Output-Weighted Importance Sampling
arXiv:2310.15351 [cs.LG] (Published 2023-10-23)
Random Exploration in Bayesian Optimization: Order-Optimal Regret and Computational Efficiency
arXiv:1902.02416 [cs.LG] (Published 2019-02-06)
Fast Hyperparameter Tuning using Bayesian Optimization with Directional Derivatives