arXiv Analytics

arXiv:1703.00144 [cs.LG]

Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank

Liang Zhao, Siyu Liao, Yanzhi Wang, Jian Tang, Bo Yuan

Published 2017-03-01 (Version 1)

Recently, low displacement rank (LDR) matrices, also known as structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve significant reductions in space and computational complexity while retaining high accuracy. We formally study LDR matrices in deep learning. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, in both single-layer and multi-layer settings. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.
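The abstract does not fix a particular LDR family, but a circulant matrix is a standard example: it is determined by its first column (n parameters instead of n²), and its matrix-vector product can be computed in O(n log n) via the FFT. The sketch below, using NumPy, illustrates this compression and speedup; the function names are illustrative, not from the paper.

```python
import numpy as np

# A circulant matrix C has entries C[i, j] = c[(i - j) mod n], so it is
# fully defined by its first column c. It is a classic low displacement
# rank structure, and C @ x equals the circular convolution of c and x,
# computable in O(n log n) with the FFT instead of O(n^2).

def circulant_matvec(c, x):
    """Multiply the circulant matrix defined by first column c with x via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

def circulant_matrix(c):
    """Build the dense n x n circulant matrix explicitly (for checking)."""
    return np.column_stack([np.roll(c, k) for k in range(len(c))])

rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)   # n parameters define the whole n x n layer
x = rng.standard_normal(n)

dense = circulant_matrix(c) @ x   # O(n^2) reference product
fast = circulant_matvec(c, x)     # O(n log n) structured product
assert np.allclose(dense, fast)
```

In an LDR neural network, such a structured product would replace the dense matrix-vector product in each layer, and the gradient with respect to the defining vector c can be propagated through the same FFT operations during back-propagation.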

Related articles:
arXiv:1810.02309 [cs.LG] (Published 2018-10-04)
Learning Compressed Transforms with Low Displacement Rank
arXiv:1206.4639 [cs.LG] (Published 2012-06-18)
Adaptive Regularization for Weight Matrices
arXiv:1807.03165 [cs.LG] (Published 2018-07-06)
Sparse Deep Neural Network Exact Solutions