arXiv Analytics


arXiv:2210.07100 [cs.LG]

Dissipative residual layers for unsupervised implicit parameterization of data manifolds

Viktor Reshniak

Published 2022-10-13, Version 1

We propose an unsupervised technique for implicit parameterization of data manifolds. In our approach, the data is assumed to belong to a lower-dimensional manifold in a higher-dimensional space, and the data points are viewed as the endpoints of trajectories originating outside the manifold. Under this assumption, the data manifold is an attracting manifold of a dynamical system to be estimated. We parameterize such a dynamical system with a residual neural network and propose a spectral localization technique to ensure that it is locally attractive in the vicinity of the data. We also present an initialization scheme and additional regularization for the proposed residual layers. Finally, we note the importance of the considered problem for reinforcement learning and support the discussion with examples demonstrating the performance of the proposed layers on denoising and generative tasks.
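To make the dynamical-systems picture concrete, here is a minimal toy sketch (not the paper's architecture, and with no learned parameters): a hand-crafted "dissipative residual layer" whose fixed points form a one-dimensional manifold, the unit circle, embedded in two-dimensional space. Each layer applies x ← x + h·f(x), where f pushes points radially toward the circle, so the manifold is locally attractive and points started off-manifold are the endpoints of trajectories that terminate on it. The step size `h`, depth, and the radial drift field are illustrative choices, not values from the paper.

```python
# Toy illustration: a dissipative residual map whose attracting set is the
# unit circle (a 1-D "data manifold" in 2-D ambient space).
import math

def residual_layer(x, h=0.2):
    """One residual step x <- x + h * f(x), with f a radial drift toward r = 1."""
    r = math.hypot(x[0], x[1])
    if r == 0.0:
        return x  # the origin is an (unstable) equilibrium; leave it fixed
    scale = h * (1.0 - r) / r  # contracts |r - 1| by a factor (1 - h) per step
    return (x[0] + scale * x[0], x[1] + scale * x[1])

def flow(x, depth=50):
    """Stack `depth` residual layers, i.e. integrate the dynamics forward."""
    for _ in range(depth):
        x = residual_layer(x)
    return x

# Trajectories started inside or outside the circle converge onto it.
end = flow((3.0, 4.0))
print(math.hypot(*end))  # ≈ 1.0
```

In the paper the vector field f is instead parameterized by a residual neural network, and local attractiveness near the data is enforced by the proposed spectral localization rather than being built in by hand as above.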

Related articles:
arXiv:2308.13792 [cs.LG] (Published 2023-08-26)
Out-of-distribution detection using normalizing flows on the data manifold
arXiv:2006.01272 [cs.LG] (Published 2020-06-01)
Shapley-based explainability on the data manifold
arXiv:2204.08624 [cs.LG] (Published 2022-04-19)
Topology and geometry of data manifold in deep learning