
arXiv:1902.03229 [cs.LG]

Adaptive and Safe Bayesian Optimization in High Dimensions via One-Dimensional Subspaces

Johannes Kirschner, Mojmír Mutný, Nicole Hiller, Rasmus Ischebeck, Andreas Krause

Published 2019-02-08 (Version 1)

Bayesian optimization is known to be difficult to scale to high dimensions, because the acquisition step requires solving a non-convex optimization problem in the same search space. To scale the method while keeping its benefits, we propose LineBO, an algorithm that restricts the problem to a sequence of iteratively chosen one-dimensional sub-problems. We show that our algorithm converges globally and obtains a fast local rate when the function is strongly convex. Moreover, if the objective has an invariant subspace, our method automatically adapts to the effective dimension without any change to the algorithm. Our method scales well to high dimensions and makes use of a global Gaussian process model. When combined with the SafeOpt algorithm to solve the sub-problems, we obtain the first safe Bayesian optimization algorithm with theoretical guarantees that is applicable in high-dimensional settings. We evaluate our method on multiple synthetic benchmarks, where it achieves competitive performance. Finally, we deploy our algorithm to optimize the beam intensity of the Swiss Free Electron Laser with up to 40 parameters while satisfying safe operation constraints.
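To make the approach concrete, below is a minimal sketch of the LineBO loop, not the authors' implementation: a global Gaussian process surrogate is maintained over the full space, and each iteration optimizes an acquisition function only along a one-dimensional line through the incumbent. The random direction rule, the UCB acquisition, the scikit-learn surrogate, and the toy objective are all illustrative assumptions; the safe variant described in the abstract would instead run SafeOpt on each line.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):
    # Hypothetical expensive black-box function (illustrative only):
    # a negative sphere whose maximum sits at the origin.
    return -float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, lo, hi = 10, -1.0, 1.0

# One global GP model over the full d-dimensional space, as in the abstract.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              alpha=1e-6, normalize_y=True)

# A few random initial evaluations to seed the model.
X = [rng.uniform(lo, hi, size=dim) for _ in range(3)]
y = [objective(x) for x in X]
best = int(np.argmax(y))
x_best, y_best = X[best], y[best]

for _ in range(20):
    gp.fit(np.array(X), np.array(y))

    # Choose the next one-dimensional subspace: here, a random unit
    # direction through the incumbent (one of several possible rules).
    d = rng.normal(size=dim)
    d /= np.linalg.norm(d)

    # Discretize the line and clip it to the box constraints; the
    # acquisition problem is now one-dimensional and cheap to solve.
    ts = np.linspace(-2.0, 2.0, 201)
    line = np.clip(x_best + ts[:, None] * d, lo, hi)

    # UCB acquisition restricted to the line.
    mu, sigma = gp.predict(line, return_std=True)
    x_next = line[int(np.argmax(mu + 2.0 * sigma))]

    X.append(x_next)
    y.append(objective(x_next))
    if y[-1] > y_best:
        x_best, y_best = x_next, y[-1]

print("best value found:", y_best)
```

Because each sub-problem is one-dimensional, the inner acquisition optimization reduces to a cheap line search even though the surrogate lives in the full space, which is what lets the method scale to settings like the 40-parameter deployment reported in the abstract.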

Related articles:
arXiv:1703.00893 [cs.LG] (Published 2017-03-02)
Being Robust (in High Dimensions) Can Be Practical
arXiv:1206.6477 [cs.LG] (Published 2012-06-27)
Discovering Support and Affiliated Features from Very High Dimensions
arXiv:1409.2802 [cs.LG] (Published 2014-09-09)
Far-Field Compression for Fast Kernel Summation Methods in High Dimensions