arXiv:2306.11827 [cs.LG]

Any Deep ReLU Network is Shallow

Mattia Jacopo Villani, Nandi Schoots

Published 2023-06-20, Version 1

We constructively prove that every deep ReLU network can be rewritten as a functionally identical three-layer network with weights valued in the extended reals. Based on this proof, we provide an algorithm that, given a deep ReLU network, finds the explicit weights of the corresponding shallow network. The resulting shallow network is transparent and can be used to generate explanations of the model's behaviour.
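The abstract does not spell out the construction, but the structural fact it rests on is well known: a ReLU network is piecewise affine, so on each activation region it coincides with a single affine map. The sketch below is a minimal illustration of that fact, not the authors' shallowing algorithm; the function names `deep_relu` and `local_affine` and the toy dimensions are hypothetical. It extracts the affine map (A, c) that a deep ReLU network computes on the activation region containing a given input, and checks that A @ x + c reproduces the network's output there.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def deep_relu(x, weights, biases):
    # Evaluate the deep network layer by layer.
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    return weights[-1] @ h + biases[-1]

def local_affine(x, weights, biases):
    # On the activation region containing x, the network equals an
    # affine map y -> A @ y + c; propagate (A, c) through the layers,
    # zeroing the rows of inactive ReLU units.
    A = np.eye(x.shape[0])
    c = np.zeros(x.shape[0])
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        z = W @ h + b
        mask = (z > 0).astype(float)   # active units on this region
        A = mask[:, None] * (W @ A)
        c = mask * (W @ c + b)
        h = relu(z)
    W, b = weights[-1], biases[-1]
    return W @ A, W @ c + b

# Toy network: input dim 4, two hidden layers of width 8, output dim 2.
rng = np.random.default_rng(0)
dims = [4, 8, 8, 2]
weights = [rng.normal(size=(m, n)) for n, m in zip(dims[:-1], dims[1:])]
biases = [rng.normal(size=m) for m in dims[1:]]

x = rng.normal(size=4)
A, c = local_affine(x, weights, biases)
assert np.allclose(deep_relu(x, weights, biases), A @ x + c)
```

The paper's contribution goes further: it stitches all such regional affine maps into one explicit three-layer network, using extended-real weights to handle the region selection; the sketch above only exhibits the per-region affine structure that makes such a collapse plausible.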

Related articles:
arXiv:2011.04041 [cs.LG] (Published 2020-11-08)
Unwrapping The Black Box of Deep ReLU Networks: Interpretability, Diagnostics, and Simplification
arXiv:1910.08581 [cs.LG] (Published 2019-10-18)
Towards Quantifying Intrinsic Generalization of Deep ReLU Networks
arXiv:2001.03040 [cs.LG] (Published 2020-01-09)
Deep Network Approximation for Smooth Functions