arXiv Analytics

arXiv:1711.00941 [cs.LG]

Deep Active Learning over the Long Tail

Yonatan Geifman, Ran El-Yaniv

Published 2017-11-02 (Version 1)

This paper is concerned with pool-based active learning for deep neural networks. Motivated by coreset dataset-compression ideas, we present a novel active learning algorithm that queries consecutive points from the pool using farthest-first traversals in the space of neural activations of a representation layer. We show consistent and overwhelming improvement in sample complexity over passive learning (random sampling) for three datasets: MNIST, CIFAR-10, and CIFAR-100. In addition, our algorithm outperforms the traditional uncertainty sampling technique (obtained using softmax activations), and we identify cases where uncertainty sampling performs only slightly better than random sampling.
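The query rule described in the abstract lends itself to a short sketch. Below is a minimal illustration of farthest-first traversal (greedy k-center selection) over pool activations, assuming Euclidean distance in the representation space and NumPy arrays; the function name, interface, and toy data are hypothetical illustrations, not the authors' code.

```python
import numpy as np

def farthest_first_query(features, labeled_idx, budget):
    """Minimal sketch of farthest-first traversal over a pool of
    representation-layer activations (hypothetical interface).

    features    : (n, d) array of activations for all pool points
    labeled_idx : indices of already-labeled points (the seed set)
    budget      : number of new points to query for labeling
    """
    n = features.shape[0]
    # Distance from every pool point to its nearest selected point.
    min_dist = np.full(n, np.inf)
    for i in labeled_idx:
        min_dist = np.minimum(
            min_dist, np.linalg.norm(features - features[i], axis=1)
        )

    queried = []
    for _ in range(budget):
        # Greedily pick the point farthest from everything selected so far.
        idx = int(np.argmax(min_dist))
        queried.append(idx)
        # Fold the new point into the nearest-selected-point distances.
        min_dist = np.minimum(
            min_dist, np.linalg.norm(features - features[idx], axis=1)
        )
    return queried

# Toy usage: 1000 pool points with 32-d activations, 20 labeled seeds.
rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 32)).astype(np.float32)
seed = rng.choice(1000, size=20, replace=False)
batch = farthest_first_query(feats, seed, budget=10)
print(batch)  # indices of the 10 points to send for labeling
```

Each greedy step runs in O(nd) by maintaining the running minimum distance to the selected set, so a batch of b queries costs O(bnd) rather than recomputing all pairwise distances.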

Related articles:
arXiv:1509.08745 [cs.LG] (Published 2015-09-29)
Compression of Deep Neural Networks on the Fly
arXiv:1702.05659 [cs.LG] (Published 2017-02-18)
On Loss Functions for Deep Neural Networks in Classification
arXiv:2009.00236 [cs.LG] (Published 2020-08-30)
A Survey of Deep Active Learning