arXiv:2007.04785 [cs.LG]

Neural Architecture Search with GBDT

Renqian Luo, Xu Tan, Rui Wang, Tao Qin, Enhong Chen, Tie-Yan Liu

Published 2020-07-09 (Version 1)

Neural architecture search (NAS) with an accuracy predictor, which predicts the accuracy of candidate architectures, has drawn increasing interest due to its simplicity and effectiveness. Previous works employ neural-network-based predictors, which unfortunately cannot effectively exploit the tabular representations of network architectures. Since decision-tree-based models handle tabular data better, in this paper we propose to leverage gradient boosting decision trees (GBDT) as the predictor for NAS and demonstrate that this improves prediction accuracy and helps find better architectures than neural-network-based predictors. Moreover, since a compact, high-quality search space eases the search process, we propose to gradually prune the search space according to important features derived from the GBDT using the interpretability tool SHAP. In this way, NAS can be performed by first pruning the search space (using GBDT as a pruner) and then searching for a neural architecture (using GBDT as a predictor), which is more efficient and effective. Experiments on NASBench-101 and ImageNet demonstrate the effectiveness of GBDT for NAS: (1) NAS with a GBDT predictor finds a top-10 architecture (among all architectures in the search space) with $0.18\%$ test regret on NASBench-101, and achieves a $24.2\%$ top-1 error rate on ImageNet; and (2) GBDT-based search space pruning combined with neural architecture search further achieves a $23.5\%$ top-1 error rate on ImageNet.
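To make the two GBDT roles concrete, below is a minimal sketch (not the authors' implementation; their code is at the GitHub link in the comments) of a GBDT accuracy predictor over tabular architecture encodings, plus SHAP-based feature importances that could drive search-space pruning. It assumes the lightgbm and shap packages; the random binary encoding and the prune-the-5-worst-features rule are hypothetical stand-ins for the paper's actual features and pruning schedule.

    # Sketch of GBDT-as-predictor and GBDT-as-pruner for NAS.
    # Assumptions: architectures are encoded as binary feature vectors
    # (e.g., one-hot operator choices); labels are measured accuracies.
    import numpy as np
    import lightgbm as lgb
    import shap

    rng = np.random.default_rng(0)
    n_archs, n_features = 1000, 64
    X = rng.integers(0, 2, size=(n_archs, n_features)).astype(float)
    y = rng.random(n_archs)  # placeholder for measured validation accuracies

    # (1) GBDT as predictor: regress accuracy from the tabular encoding.
    predictor = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05)
    predictor.fit(X, y)

    # Rank unseen candidates by predicted accuracy; keep the top 10.
    candidates = rng.integers(0, 2, size=(5000, n_features)).astype(float)
    scores = predictor.predict(candidates)
    top_k = candidates[np.argsort(scores)[::-1][:10]]

    # (2) GBDT as pruner: SHAP attributes each prediction to features;
    # features whose presence consistently lowers predicted accuracy mark
    # search-space regions that can be pruned.
    explainer = shap.TreeExplainer(predictor)
    shap_values = explainer.shap_values(X)  # shape: (n_archs, n_features)

    on_mask = X > 0.5
    mean_effect = np.array([
        shap_values[on_mask[:, j], j].mean() if on_mask[:, j].any() else 0.0
        for j in range(n_features)
    ])
    # Illustrative rule: drop the 5 most harmful design choices.
    prune = np.argsort(mean_effect)[:5]
    print("Prune features:", prune)

In the paper this prune-then-search loop is iterated, so the predictor is refit on the shrinking search space; the fixed cut of 5 features above is only for illustration.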

Comments: Code is available at https://github.com/renqianluo/GBDT-NAS
Categories: cs.LG, stat.ML
Related articles:
arXiv:1909.03615 [cs.LG] (Published 2019-09-09)
Neural Architecture Search in Embedding Space
arXiv:1811.07579 [cs.LG] (Published 2018-11-19, updated 2019-09-05)
Deep Active Learning with a Neural Architecture Search
arXiv:2008.08476 [cs.LG] (Published 2020-08-19)
NASCaps: A Framework for Neural Architecture Search to Optimize the Accuracy and Hardware Efficiency of Convolutional Capsule Networks