arXiv:2207.07696 [cs.LG]

Algorithmic Determination of the Combinatorial Structure of the Linear Regions of ReLU Neural Networks

Marissa Masden

Published 2022-07-15 (Version 1)

We algorithmically determine the regions and facets of all dimensions of the canonical polyhedral complex, the universal object into which a ReLU network decomposes its input space. We show that the locations of the vertices of the canonical polyhedral complex, together with their signs with respect to the layer maps, determine the full facet structure across all dimensions. We present an algorithm which computes this full combinatorial structure, making use of our theorems that the dual complex to the canonical polyhedral complex is cubical and that it possesses a multiplication compatible with its facet structure. The resulting algorithm is numerically stable, runs in time polynomial in the number of intermediate neurons, and obtains accurate information across all dimensions. This permits us to obtain, for example, the true topology of the decision boundaries of networks with low-dimensional inputs. We run experiments on such networks at initialization, finding that width alone does not increase the observed topology, but width in the presence of depth does. Source code for our algorithms is accessible online at https://github.com/mmasden/canonicalpoly.
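
The following is a minimal illustrative sketch (Python with NumPy), not the paper's implementation; see the linked repository for the actual algorithms. It computes the sign vector of an input point with respect to the layer maps of a small ReLU network, which is the kind of combinatorial label that, per the abstract, together with the vertex locations determines the facet structure of the canonical polyhedral complex. The toy weights and the helper name sign_vector are hypothetical choices for this example.

```python
# Minimal sketch, assuming a plain fully connected ReLU network.
# Not the paper's algorithm: it only illustrates the "sign with respect
# to layer maps" label attached to points of the input space.
import numpy as np

def sign_vector(x, weights, biases):
    """Return the concatenated signs (+1, 0, -1) of every pre-activation
    across all layers, evaluated at the input point x."""
    signs = []
    h = np.asarray(x, dtype=float)
    for W, b in zip(weights, biases):
        pre = W @ h + b              # pre-activation of this layer map
        signs.append(np.sign(pre))   # sign w.r.t. each neuron's bent hyperplane
        h = np.maximum(pre, 0.0)     # ReLU before the next layer
    return np.concatenate(signs)

# Toy network with 2-dimensional input and two hidden layers
# (weights chosen arbitrarily for illustration).
weights = [np.array([[1.0, -1.0], [0.5, 2.0]]),
           np.array([[1.0, 1.0], [-1.0, 0.5]])]
biases = [np.array([0.0, -1.0]), np.array([0.5, 0.0])]

print(sign_vector([0.3, -0.2], weights, biases))
```

Points in the interior of the same linear region share a sign vector, while zero entries indicate that a point lies on a lower-dimensional cell of the complex, such as a vertex.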

Related articles:
arXiv:2305.05562 [cs.LG] (Published 2023-05-09)
SkelEx and BoundEx: Natural Visualization of ReLU Neural Networks
arXiv:2301.07966 [cs.LG] (Published 2023-01-19)
Getting Away with More Network Pruning: From Sparsity to Geometry and Linear Regions
Junyang Cai et al.
arXiv:2305.15141 [cs.LG] (Published 2023-05-24)
From Tempered to Benign Overfitting in ReLU Neural Networks