arXiv Analytics


arXiv:1605.03498 [cs.CV]

Deep Neural Networks Under Stress

Micael Carvalho, Matthieu Cord, Sandra Avila, Nicolas Thome, Eduardo Valle

Published 2016-05-11 (Version 1)

In recent years, deep architectures have been used for transfer learning, achieving state-of-the-art performance on many datasets. The properties of their features, however, remain largely unstudied from the transfer perspective. In this work, we present an extensive analysis of the resilience of feature vectors extracted from deep models, with special focus on the trade-off between performance and compression rate. By introducing perturbations to image descriptions extracted from a deep convolutional neural network, we change their precision and number of dimensions, and measure how these changes affect the final classification score. We show that deep features are more robust to these disturbances than classical approaches, achieving a compression rate of 98.4% while losing only 0.88% of their original score on Pascal VOC 2007.
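To make the two kinds of perturbation concrete, here is a minimal sketch of the idea in NumPy. The exact quantization scheme, the dimension-selection strategy, and all sizes below are illustrative assumptions, not the authors' actual protocol: we uniformly quantize a feature vector to fewer bits (reducing precision) and keep a random subset of its dimensions (reducing dimensionality), then compute the resulting compression rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4096-d feature vector, standing in for features extracted
# from a deep CNN's penultimate layer (values are random, for illustration).
features = rng.standard_normal(4096).astype(np.float32)

def quantize(x, bits):
    """Uniformly quantize values to 2**bits levels over the vector's range."""
    lo, hi = float(x.min()), float(x.max())
    levels = 2 ** bits - 1
    q = np.round((x - lo) / (hi - lo) * levels)
    return q.astype(np.uint8 if bits <= 8 else np.uint16), lo, hi

def dequantize(q, lo, hi, bits):
    """Map quantized codes back to approximate float values."""
    levels = 2 ** bits - 1
    return (q.astype(np.float32) / levels) * (hi - lo) + lo

# Perturbation 1: reduce precision from 32-bit floats to 4-bit codes.
q, lo, hi = quantize(features, bits=4)
recovered = dequantize(q, lo, hi, bits=4)

# Perturbation 2: reduce dimensionality by keeping a random subset of dims.
keep = rng.choice(features.size, size=512, replace=False)
reduced = features[keep]

# Compression rate: fraction of bits saved when both perturbations combine
# (512 kept dimensions at 4 bits each, versus 4096 dimensions at 32 bits).
original_bits = features.size * 32
compressed_bits = keep.size * 4
rate = 1 - compressed_bits / original_bits
print(f"compression rate: {rate:.1%}")  # prints "compression rate: 98.4%"
```

This particular (hypothetical) combination of 512 dimensions at 4 bits happens to yield the 98.4% rate quoted in the abstract; the paper measures how the classification score degrades under such settings.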

Comments: This article corresponds to the accepted version at IEEE ICIP 2016. We will link the DOI as soon as it is available
Categories: cs.CV, cs.AI