arXiv Analytics

arXiv:1312.4569 [cs.CV]

Dropout improves Recurrent Neural Networks for Handwriting Recognition

Vu Pham, Théodore Bluche, Christopher Kermorvant, Jérôme Louradour

Published 2013-11-05, updated 2014-03-10 (Version 2)

Recurrent neural networks (RNNs) with Long Short-Term Memory cells currently hold the best known results in unconstrained handwriting recognition. We show that their performance can be greatly improved using dropout, a recently proposed regularization method for deep architectures. While previous work showed that dropout gives superior performance in the context of convolutional networks, it had never been applied to RNNs. In our approach, dropout is applied carefully so that it does not affect the recurrent connections, preserving the power of RNNs in modeling sequences. Extensive experiments on a broad range of handwriting databases confirm the effectiveness of dropout on deep architectures even when the network consists mainly of recurrent and shared connections.
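
The sketch below illustrates the general idea described in the abstract, not the authors' actual implementation: in a stacked recurrent network, dropout is placed only on the feed-forward connections between layers, while the recurrent (hidden-to-hidden) connections inside each LSTM are left untouched. The layer sizes and dropout rate are illustrative assumptions, using PyTorch.

```python
# Hedged sketch: dropout only on the feed-forward path between recurrent
# layers, never on the recurrent transitions themselves. All sizes and the
# dropout probability are assumptions for illustration.
import torch
import torch.nn as nn


class StackedLSTMWithDropout(nn.Module):
    def __init__(self, input_size=32, hidden_size=128, num_layers=3, p=0.5):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            in_size = input_size if i == 0 else hidden_size
            # Each single-layer nn.LSTM applies no dropout internally,
            # so its recurrent dynamics are unaffected.
            self.layers.append(nn.LSTM(in_size, hidden_size, batch_first=True))
        # Dropout sits only between layers, on the feed-forward connections.
        self.dropout = nn.Dropout(p)

    def forward(self, x):
        for layer in self.layers:
            x, _ = layer(x)      # full recurrent pass, no dropout inside
            x = self.dropout(x)  # dropout on the layer's output sequence
        return x


if __name__ == "__main__":
    model = StackedLSTMWithDropout()
    seq = torch.randn(4, 100, 32)  # (batch, time, features), dummy input
    out = model(seq)
    print(out.shape)               # torch.Size([4, 100, 128])
```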

Related articles:
arXiv:1605.06465 [cs.CV] (Published 2016-05-20)
Swapout: Learning an ensemble of deep architectures
arXiv:1612.00891 [cs.CV] (Published 2016-12-02)
Parameter Compression of Recurrent Neural Networks and Degradation of Short-term Memory
arXiv:1704.04055 [cs.CV] (Published 2017-04-13)
Land Cover Classification via Multi-temporal Spatial Data by Recurrent Neural Networks