{ "id": "1907.10599", "version": "v1", "published": "2019-07-24T17:58:45.000Z", "updated": "2019-07-24T17:58:45.000Z", "title": "A Fine-Grained Spectral Perspective on Neural Networks", "authors": [ "Greg Yang", "Hadi Salman" ], "comment": "12 pages of main text, 13 figures, 35 pages including appendix", "categories": [ "cs.LG", "cs.NE", "stat.ML" ], "abstract": "Are neural networks biased toward simple functions? Does depth always help learn more complex features? Is training the last layer of a network as good as training all layers? These questions seem unrelated at face value, but in this work we give all of them a common treatment from the spectral perspective. We study the spectra of the *Conjugate Kernel*, CK (also called the *Neural Network-Gaussian Process Kernel*), and the *Neural Tangent Kernel*, NTK. Roughly, the CK and the NTK tell us respectively \"what a network looks like at initialization\" and \"what a network looks like during and after training.\" Their spectra then encode valuable information about the initial distribution and the training and generalization properties of neural networks. By analyzing the eigenvalues, we lend novel insights into the questions put forth at the beginning, and we verify these insights through extensive experiments on neural networks. We believe the computational tools we develop here for analyzing the spectra of the CK and NTK serve as a solid foundation for future studies of deep neural networks. We have open-sourced the code for these tools and for generating the plots in this paper at github.com/thegregyang/NNspectra.", "revisions": [ { "version": "v1", "updated": "2019-07-24T17:58:45.000Z" } ], "analyses": { "keywords": [ "fine-grained spectral perspective", "network looks", "deep neural networks", "lend novel insights", "complex features" ], "note": { "typesetting": "TeX", "pages": 12, "language": "en", "license": "arXiv", "status": "editable" } } }