arXiv Analytics

arXiv:1901.10801 [cs.LG]

Generalized Tensor Models for Recurrent Neural Networks

Valentin Khrulkov, Oleksii Hrinchuk, Ivan Oseledets

Published 2019-01-30 (Version 1)

Recurrent Neural Networks (RNNs) are very successful at solving challenging problems involving sequential data, but this empirical success is not yet fully explained by theory. It is known that a certain class of multiplicative RNNs enjoys the property of depth efficiency: a shallow network must have exponentially large width to realize the same score function as such an RNN. These multiplicative networks, however, are rarely applied to real-life tasks. In this work, we attempt to narrow the gap between theory and practice by extending the theoretical analysis to RNNs that employ various nonlinearities, such as the Rectified Linear Unit (ReLU), and we show that these networks also enjoy the properties of universality and depth efficiency. Our theoretical results are verified by a series of extensive computational experiments.
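To make the abstract's distinction concrete, below is a minimal NumPy sketch, an illustrative assumption rather than the authors' exact parameterization (the paper works with a bilinear tensor form). It contrasts a multiplicative RNN score function, whose hidden state is updated with an elementwise product, with a generalized variant in which that product is replaced by a nonlinear pairing such as xi(a, b) = ReLU(a + b):

    # Hypothetical sketch of the two kinds of score functions discussed
    # in the abstract; names, update rules, and dimensions are assumptions.
    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def score_multiplicative(inputs, A, h0, w_out):
        """Multiplicative update: h_t = (A @ x_t) * h_{t-1} (elementwise *)."""
        h = h0
        for x in inputs:
            h = (A @ x) * h      # multiplicative interaction
        return w_out @ h         # scalar score of the whole sequence

    def score_generalized(inputs, A, h0, w_out):
        """Generalized update: product replaced by xi(a, b) = ReLU(a + b)."""
        h = h0
        for x in inputs:
            h = relu(A @ x + h)  # nonlinear pairing instead of product
        return w_out @ h

    # Toy usage on a random sequence (dimensions are illustrative only).
    rng = np.random.default_rng(0)
    T, d_in, d_h = 5, 3, 4
    inputs = [rng.standard_normal(d_in) for _ in range(T)]
    A = rng.standard_normal((d_h, d_in))
    h0 = rng.standard_normal(d_h)
    w_out = rng.standard_normal(d_h)
    print(score_multiplicative(inputs, A, h0, w_out))
    print(score_generalized(inputs, A, h0, w_out))

In the multiplicative case the score is a multilinear form of the inputs, which is what connects such networks to tensor decompositions; the paper's point, per the abstract, is that universality and depth-efficiency results survive when the product is replaced by nonlinear pairings like the ReLU-based one above.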

Comments: Accepted as a conference paper at ICLR 2019
Categories: cs.LG, stat.ML
Related articles:
arXiv:1809.05896 [cs.LG] (Published 2018-09-16)
Classifying Process Instances Using Recurrent Neural Networks
arXiv:1902.07275 [cs.LG] (Published 2019-02-19)
Understanding and Controlling Memory in Recurrent Neural Networks
arXiv:1904.09816 [cs.LG] (Published 2019-04-22)
Adversarial Dropout for Recurrent Neural Networks