## arXiv Analytics

### arXiv:1806.05159 [cs.LG]

#### On Tighter Generalization Bound for Deep Neural Networks: CNNs, ResNets, and Beyond

Published 2018-06-13 (Version 1)

Our paper proposes a generalization error bound for a general family of deep neural networks based on the spectral norms of their weight matrices. By introducing a novel characterization of the Lipschitz properties of this family, we obtain a tighter generalization error bound for ultra-deep neural networks, i.e., networks whose depth is much larger than the square root of their width. Beyond general deep neural networks, our results yield new bounds for several popular architectures, including convolutional neural networks (CNNs), residual networks (ResNets), and hyperspherical networks (SphereNets). In the regime where the depth of these architectures dominates, our bounds permit a much larger parameter space of weight matrices, inducing potentially stronger expressive power.
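Bounds of this kind scale with the product of the spectral norms (largest singular values) of the per-layer weight matrices, which upper-bounds the Lipschitz constant of the linear parts of the network. A minimal sketch of that quantity on a hypothetical toy network, assuming NumPy (the matrices and sizes here are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical 3-layer fully connected network: random weight matrices,
# scaled so their entries have variance 1/width (illustrative only).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 64)) / np.sqrt(64) for _ in range(3)]

# Spectral norm of each layer = its largest singular value.
spectral_norms = [np.linalg.norm(W, ord=2) for W in weights]

# The Lipschitz constant of the composition of the linear maps is bounded
# by the product of the per-layer spectral norms; spectral-norm-based
# generalization bounds grow with this product.
lipschitz_bound = float(np.prod(spectral_norms))
print(lipschitz_bound)
```

With 1-Lipschitz activations (e.g. ReLU) interleaved, the same product still bounds the Lipschitz constant of the full network, which is why controlling spectral norms controls capacity.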
