arXiv:1811.07579 [cs.LG]
Deep Active Learning with a Neural Architecture Search
Published 2018-11-19, updated 2019-09-05 (version 2)
We consider active learning of deep neural networks. Most prior work on active learning in this context has focused on designing effective querying mechanisms, assuming that an appropriate network architecture is known a priori for the problem at hand. We challenge this assumption and propose a novel active strategy whereby the learning algorithm searches for effective architectures on the fly, while actively learning. We apply our strategy using three known querying techniques (softmax response, MC-dropout, and coresets) and show that the proposed approach overwhelmingly outperforms active learning with fixed architectures.
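To make the querying step concrete, the following is a minimal sketch of the softmax-response criterion mentioned in the abstract: from an unlabeled pool, the learner selects the examples whose top softmax probability (the model's confidence) is lowest. The function names and the toy logits are illustrative assumptions, not the paper's code.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def softmax_response_query(logits, budget):
    """Return indices of the `budget` pool examples whose maximum
    softmax probability is lowest (least-confident first)."""
    confidence = softmax(logits).max(axis=-1)
    return np.argsort(confidence)[:budget]

# Toy unlabeled pool: 4 examples, 3 classes (hypothetical logits).
pool_logits = np.array([[4.0, 0.1, 0.2],   # confident prediction
                        [1.0, 0.9, 1.1],   # nearly uniform -> uncertain
                        [3.0, 0.2, 0.1],   # confident prediction
                        [0.0, 0.5, 0.2]])  # moderately uncertain
picked = softmax_response_query(pool_logits, budget=2)
```

In a full active-learning loop this selection would run after each training round, with the chosen examples sent for labeling; the paper's contribution is to also re-search the architecture between rounds rather than keep it fixed.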
Comments: Accepted to NeurIPS 2019
Related articles:
arXiv:1711.00941 [cs.LG] (Published 2017-11-02)
Deep Active Learning over the Long Tail
arXiv:1611.05162 [cs.LG] (Published 2016-11-16)
Net-Trim: A Layer-wise Convex Pruning of Deep Neural Networks
arXiv:2009.00236 [cs.LG] (Published 2020-08-30)
A Survey of Deep Active Learning