arXiv Analytics

arXiv:2105.14625 [cs.LG]

Surrogate Model Based Hyperparameter Tuning for Deep Learning with SPOT

Thomas Bartz-Beielstein

Published 2021-05-30 (Version 1)

A surrogate model based hyperparameter tuning approach for deep learning is presented. This article demonstrates how the architecture-level parameters (hyperparameters) of deep learning models implemented in Keras/TensorFlow can be optimized. The implementation of the tuning procedure is 100% based on R, the software environment for statistical computing. With a few lines of code, existing R packages (tfruns and SPOT) can be combined to perform hyperparameter tuning. An elementary hyperparameter tuning task (a neural network on the MNIST data) is used to exemplify this approach.
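The abstract states that tfruns and SPOT can be combined in a few lines of R to tune Keras hyperparameters. The sketch below illustrates one way such a coupling might look; it is not the paper's code. The training script name mnist_mlp.R, the flag names dropout1 and units1, and the metric column metric_val_loss are assumptions for illustration and would need to match whatever the training script actually declares and logs.

```r
## Minimal sketch (assumptions, not the paper's code): tune two hypothetical
## flags, "dropout1" and "units1", of a Keras MNIST script "mnist_mlp.R".
## The script is assumed to declare these flags via tfruns::flags() and to
## report validation loss so that it appears as "metric_val_loss" in the run.
library(SPOT)
library(tfruns)

## Objective function for SPOT: each row of x is one candidate configuration;
## the function must return a one-column matrix of objective values.
funKerasMnist <- function(x) {
  y <- apply(x, 1, function(par) {
    training_run("mnist_mlp.R",
                 flags = list(dropout1 = par[1],
                              units1   = round(par[2])))
    ## Read the validation loss of the run that just finished
    ## (column name assumed; adjust to what the script logs).
    latest_run()$metric_val_loss
  })
  matrix(y, ncol = 1)
}

## Surrogate model based optimization with a small evaluation budget.
res <- spot(x = NULL,
            fun = funKerasMnist,
            lower = c(0.1, 16),
            upper = c(0.9, 512),
            control = list(funEvals = 20))

res$xbest   # best hyperparameter configuration found
res$ybest   # corresponding validation loss
```

By default SPOT fits a surrogate (e.g., Kriging) to the evaluated configurations and proposes new candidates from it, so the expensive training_run() call is invoked only funEvals times.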

Related articles:
arXiv:2202.09275 [cs.LG] (Published 2022-02-18)
Rethinking Pareto Frontier for Performance Evaluation of Deep Neural Networks
arXiv:2010.03207 [cs.LG] (Published 2020-10-07)
Deep learning models for predictive maintenance: a survey, comparison, challenges and prospect
arXiv:1510.04781 [cs.LG] (Published 2015-10-16)
A Survey: Time Travel in Deep Learning Space: An Introduction to Deep Learning Models and How Deep Learning Models Evolved from the Initial Ideas