arXiv Analytics

arXiv:2007.07588 [cs.LG]

Importance of Tuning Hyperparameters of Machine Learning Algorithms

Hilde J. P. Weerts, Andreas C. Mueller, Joaquin Vanschoren

Published 2020-07-15, Version 1

The performance of many machine learning algorithms depends on their hyperparameter settings. The goal of this study is to determine whether it is important to tune a hyperparameter or whether it can safely be left at a default value. We present a methodology for determining the importance of tuning a hyperparameter based on a non-inferiority test and the tuning risk: the performance loss incurred when a hyperparameter is not tuned but set to a default value. Because our methods require the notion of a default value, we also present a simple procedure for determining reasonable default hyperparameter values. We apply our methods in a benchmark study using 59 datasets from OpenML. Our results show that leaving particular hyperparameters at their default values is non-inferior to tuning these hyperparameters. In some cases, leaving a hyperparameter at its default value even outperforms tuning it with a search procedure limited to a small number of iterations.
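To make the notion of tuning risk concrete, the sketch below (not the authors' code) estimates it for a single hyperparameter: the cross-validated performance of a model left at its default setting is compared with that of the same model tuned by a budget-limited random search, and the difference is the performance lost by not tuning. The choice of scikit-learn's SVC with its gamma hyperparameter, a built-in dataset standing in for the paper's 59 OpenML datasets, AUC as the metric, and a 20-iteration search budget are all illustrative assumptions.

# Illustrative sketch: estimate the "tuning risk" of one hyperparameter as the
# cross-validated performance loss incurred by leaving it at its default value
# instead of tuning it with a budget-limited random search.
import numpy as np
from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # stand-in for an OpenML task

# Performance with the hyperparameter left at its default value.
default_model = make_pipeline(StandardScaler(), SVC(gamma="scale"))
default_score = cross_val_score(default_model, X, y, cv=5, scoring="roc_auc").mean()

# Performance when the hyperparameter is tuned with a limited search budget.
search = RandomizedSearchCV(
    make_pipeline(StandardScaler(), SVC()),
    param_distributions={"svc__gamma": loguniform(1e-4, 1e1)},
    n_iter=20,          # limited number of search iterations (assumed budget)
    cv=5,
    scoring="roc_auc",
    random_state=0,
)
tuned_score = cross_val_score(search, X, y, cv=5, scoring="roc_auc").mean()

# Tuning risk: performance lost by not tuning. A negative value means the
# default value outperformed the budget-limited search on this dataset.
tuning_risk = tuned_score - default_score
print(f"default AUC={default_score:.4f}  tuned AUC={tuned_score:.4f}  "
      f"tuning risk={tuning_risk:.4f}")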

Related articles:
arXiv:2008.13690 [cs.LG] (Published 2020-08-31)
Evaluation of machine learning algorithms for Health and Wellness applications: a tutorial
arXiv:1506.00852 [cs.LG] (Published 2015-06-02)
Peer Grading in a Course on Algorithms and Data Structures: Machine Learning Algorithms do not Improve over Simple Baselines
arXiv:1907.12363 [cs.LG] (Published 2019-07-25)
A comparison of Deep Learning performances with others machine learning algorithms on credit scoring unbalanced data