arXiv Analytics

arXiv:1810.03370 [cs.LG]

Empirical Bounds on Linear Regions of Deep Rectifier Networks

Thiago Serra, Srikumar Ramalingam

Published 2018-10-08Version 1

One way to characterize the expressiveness of a piecewise linear neural network is by the number of linear regions, or pieces, of the function it models. Substantial progress has been made on this topic through lower and upper bounds on the maximum number of linear regions and through an exact counting procedure. However, these bounds account only for the dimensions of the network, and exact counting may take a prohibitive amount of time, which makes it infeasible to benchmark the expressiveness of networks. In this work, we approximate the number of linear regions of specific rectifier networks with an algorithm for probabilistic lower bounds on the size of mixed-integer linear sets. In addition, we present a tighter upper bound that leverages the network coefficients. We test both on trained networks. The algorithm for probabilistic lower bounds is several orders of magnitude faster than exact counting, and the values it produces reach similar orders of magnitude, making our approach a viable method for comparing the expressiveness of such networks. The refined upper bound is particularly stronger on networks with narrow layers.
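As a loose illustration of the lower-bounding idea (not the paper's mixed-integer algorithm, which works on a MILP formulation of the network): each input to a rectifier network activates a particular on/off pattern of ReLU units, and distinct activation patterns observed over sampled inputs correspond to distinct linear regions, so the number of unique patterns seen is a probabilistic lower bound on the region count. The network sizes and sampling box below are arbitrary choices for the sketch.

```python
# Sampling-based probabilistic lower bound on the number of linear
# regions of a small ReLU network (illustrative sketch only; the paper
# uses a mixed-integer approach rather than input sampling).
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-16-16 rectifier network with random weights (hypothetical).
W1 = rng.standard_normal((16, 2)); b1 = rng.standard_normal(16)
W2 = rng.standard_normal((16, 16)); b2 = rng.standard_normal(16)

def activation_pattern(x):
    """Return the on/off state of every hidden ReLU unit for input x."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0.0) + b2
    return tuple((h1 > 0).tolist() + (h2 > 0).tolist())

def lower_bound_regions(n_samples=10_000, box=3.0):
    """Count distinct activation patterns over uniform samples in a box.

    Each distinct pattern certifies a distinct linear region, so the
    returned count is a valid lower bound (with high probability it
    grows toward the true count as n_samples increases).
    """
    patterns = set()
    for _ in range(n_samples):
        x = rng.uniform(-box, box, size=2)
        patterns.add(activation_pattern(x))
    return len(patterns)

print(lower_bound_regions())
```

Unlike exact counting, this scales with the number of samples rather than with the (potentially exponential) number of regions, which is the trade-off the abstract describes: much faster, at the cost of only lower-bounding the true count.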

Related articles:
arXiv:2206.00228 [cs.LG] (Published 2022-06-01)
Lower and Upper Bounds for Numbers of Linear Regions of Graph Convolutional Networks
arXiv:1708.07244 [cs.LG] (Published 2017-08-24)
On the Compressive Power of Deep Rectifier Networks for High Resolution Representation of Class Boundaries
arXiv:1703.10355 [cs.LG] (Published 2017-03-30)
From Deep to Shallow: Transformations of Deep Rectifier Networks