arXiv Analytics


arXiv:1603.03657 [cs.LG]

Efficient forward propagation of time-sequences in convolutional neural networks using Deep Shifting

Koen Groenland, Sander Bohte

Published 2016-03-11 (Version 1)

When a Convolutional Neural Network is used for on-the-fly evaluation of continuously updating time-sequences, many redundant convolution operations are performed. We propose the method of Deep Shifting, which remembers previously calculated results of convolution operations in order to minimize the number of calculations. The reduction in computational complexity is at least a constant factor and, in the best case, quadratic. We demonstrate that this method indeed saves significant computation time in a practical implementation, especially when the network receives a large number of time-frames.
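The caching idea behind the abstract can be illustrated with a minimal single-layer sketch: outputs of a temporal convolution that depend only on already-seen frames are stored, so a newly arriving frame triggers only one new output computation instead of re-running the convolution over the whole sequence. The class name, weight layout, and valid-convolution setup below are assumptions for illustration, not the paper's Deep Shifting implementation.

```python
import numpy as np

class CachedTemporalConv:
    """Sketch of caching 1-D temporal convolution outputs for streaming input.

    With a kernel of length k, the output at time t depends only on input
    frames t-k+1 .. t, so outputs for earlier time steps never change and
    can be reused instead of recomputed.
    """

    def __init__(self, weights):
        # weights: (kernel_len, in_channels, out_channels) -- hypothetical layout
        self.weights = weights
        self.kernel_len = weights.shape[0]
        self.frames = []    # input frames seen so far, each (in_channels,)
        self.outputs = []   # cached outputs, each (out_channels,)

    def push_frame(self, frame):
        """Append one new input frame and compute only the newest output."""
        self.frames.append(frame)
        if len(self.frames) < self.kernel_len:
            return None  # not enough history yet for a full (valid) convolution
        window = np.stack(self.frames[-self.kernel_len:])  # (kernel_len, in_channels)
        # Sum over kernel position k and input channel c; previously cached
        # outputs in self.outputs are left untouched.
        new_out = np.einsum('kc,kco->o', window, self.weights)
        self.outputs.append(new_out)
        return new_out

# Example: feed frames one at a time; each call does O(k) work rather than
# reconvolving the entire sequence.
rng = np.random.default_rng(0)
layer = CachedTemporalConv(rng.standard_normal((3, 4, 8)))  # k=3, 4 in, 8 out
for t in range(10):
    out = layer.push_frame(rng.standard_normal(4))
```

In a deep network the same reuse applies layer by layer, which is where the constant-to-quadratic savings described in the abstract come from.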

Related articles:
arXiv:1912.05687 [cs.LG] (Published 2019-12-11)
REFINED (REpresentation of Features as Images with NEighborhood Dependencies): A novel feature representation for Convolutional Neural Networks
arXiv:1806.02012 [cs.LG] (Published 2018-06-06)
A Peek Into the Hidden Layers of a Convolutional Neural Network Through a Factorization Lens
arXiv:2105.04232 [cs.LG] (Published 2021-05-10)
De-homogenization using Convolutional Neural Networks