arXiv:2001.03040 [cs.LG]

Deep Network Approximation for Smooth Functions

Jianfeng Lu, Zuowei Shen, Haizhao Yang, Shijun Zhang

Published 2020-01-09 (Version 1)

This paper establishes the optimal approximation error characterization of deep rectified linear unit (ReLU) networks for smooth functions in terms of both width and depth simultaneously. To that end, we first prove that multivariate polynomials can be approximated by deep ReLU networks of width $\mathcal{O}(N)$ and depth $\mathcal{O}(L)$ with an approximation error $\mathcal{O}(N^{-L})$. Through local Taylor expansions and their deep ReLU network approximations, we show that deep ReLU networks of width $\mathcal{O}(N\ln N)$ and depth $\mathcal{O}(L\ln L)$ can approximate $f\in C^s([0,1]^d)$ with a nearly optimal approximation rate $\mathcal{O}(\|f\|_{C^s([0,1]^d)}N^{-2s/d}L^{-2s/d})$. Our estimate is non-asymptotic in the sense that it is valid for arbitrary width and depth specified by $N\in\mathbb{N}^+$ and $L\in\mathbb{N}^+$, respectively.
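To give a sense of why depth buys exponential accuracy here, below is a minimal Python sketch of Yarotsky's well-known ReLU approximation of $x^2$, a standard building block for approximating polynomials with ReLU networks, not necessarily the exact construction of this paper. Composing a piecewise-linear "hat" function with itself $k$ times yields a depth-$\mathcal{O}(k)$ ReLU network whose uniform error on $[0,1]$ is $2^{-2(k+1)}$, mirroring the depth-exponential $\mathcal{O}(N^{-L})$ polynomial rate quoted above.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function on [0, 1] built from three ReLUs:
    # hat(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def x_squared_approx(x, k):
    # Yarotsky's construction: f_k(x) = x - sum_{j=1}^k hat^{(j)}(x) / 4**j,
    # where hat^{(j)} is the j-fold composition (a depth-j ReLU sub-network).
    # Uniform error on [0, 1] is 2**(-2k - 2): it shrinks 4x per extra layer.
    out = x.copy()
    h = x.copy()
    for j in range(1, k + 1):
        h = hat(h)  # one more composed layer; the graph gains 2**j teeth
        out -= h / 4.0**j
    return out

x = np.linspace(0.0, 1.0, 10001)
for k in (2, 4, 6, 8):
    err = np.max(np.abs(x_squared_approx(x, k) - x**2))
    print(f"k={k}: max error {err:.2e} (bound {2.0**(-2*k - 2):.2e})")
```

From $x^2$ one obtains products via $xy = ((x+y)^2 - x^2 - y^2)/2$, hence monomials and polynomials; the local Taylor expansion step described in the abstract then patches such polynomial approximants together over $[0,1]^d$.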

Related articles:
arXiv:2006.12231 [cs.LG] (Published 2020-06-22)
Deep Network Approximation with Discrepancy Being Reciprocal of Width to Power of Depth
arXiv:2307.06555 [cs.LG] (Published 2023-07-13)
Deep Network Approximation: Beyond ReLU to Diverse Activation Functions
arXiv:2107.02397 [cs.LG] (Published 2021-07-06)
Deep Network Approximation With Accuracy Independent of Number of Neurons