arXiv Analytics

arXiv:1903.12384 [cs.LG]

Deep Representation with ReLU Neural Networks

Andreas Heinecke, Wen-Liang Hwang

Published 2019-03-29 (Version 1)

We consider deep feedforward neural networks with rectified linear units from a signal processing perspective. In this view, such representations mark the transition from using a single (data-driven) linear representation to utilizing a large collection of affine linear representations, each tailored to a particular region of the signal space. This paper provides a precise description of the individual affine linear representations and the corresponding domain regions that the (data-driven) neural network associates to each signal of the input space. In particular, we describe atomic decompositions of the representations and, based on estimates of their Lipschitz regularity, suggest conditions that can stabilize learning independently of the network depth. Such an analysis may promote further theoretical insight from both the signal processing and machine learning communities.
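The core observation, that a ReLU network acts as a single affine map on each region of input space sharing the same activation pattern, can be illustrated with a minimal sketch. The network below is a hypothetical two-layer example with random weights (not from the paper); it extracts the effective affine map `(W_eff, b_eff)` at a given input and checks that it reproduces the network's output there:

```python
import numpy as np

# Hypothetical 2-layer ReLU network with random weights (illustrative only).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((2, 5)), rng.standard_normal(2)

def relu_net(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# The activation pattern at x selects which hidden units are "on";
# on the whole region sharing that pattern the network is one affine map.
x = rng.standard_normal(3)
D = np.diag((W1 @ x + b1 > 0).astype(float))  # diagonal 0/1 activation pattern
W_eff = W2 @ D @ W1                            # effective linear part
b_eff = W2 @ D @ b1 + b2                       # effective offset

assert np.allclose(relu_net(x), W_eff @ x + b_eff)
```

The Lipschitz estimates mentioned in the abstract concern how the operator norms of such region-wise maps `W_eff` behave across the collection of regions and with depth.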

Related articles:
- arXiv:2207.12545 [cs.LG] (Published 2022-07-25): $p$-DkNN: Out-of-Distribution Detection Through Statistical Testing of Deep Representations
- arXiv:2012.01780 [cs.LG] (Published 2020-12-03): Neural Contextual Bandits with Deep Representation and Shallow Exploration
- arXiv:1306.2759 [cs.LG] (Published 2013-06-12): Horizontal and Vertical Ensemble with Deep Representation for Classification