arXiv Analytics

arXiv:1806.00370 [math.OC]

Nonlinear Acceleration of CNNs

Damien Scieur, Edouard Oyallon, Alexandre d'Aspremont, Francis Bach

Published 2018-06-01 (Version 1)

The Regularized Nonlinear Acceleration (RNA) algorithm is an acceleration method capable of improving the rate of convergence of many optimization schemes such as gradient descent, SAGA or SVRG. Until now, its analysis has been limited to convex problems, but empirical observations show that RNA may be extended to wider settings. In this paper, we investigate further the benefits of RNA when applied to neural networks, in particular for the task of image recognition on CIFAR10 and ImageNet. With very few modifications of existing frameworks, RNA slightly improves the optimization process of CNNs after training.
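For readers unfamiliar with RNA, the core extrapolation step combines the last few iterates of an optimizer into a single point whose combined residual is minimized, with Tikhonov regularization for numerical stability. The sketch below is a minimal NumPy illustration under the usual formulation (residuals as successive differences, coefficients constrained to sum to one); the function name and the regularization parameter `lam` are illustrative, not taken from the paper's code.

```python
import numpy as np

def rna_extrapolate(iterates, lam=1e-8):
    """Minimal sketch of the RNA extrapolation step.

    Given iterates x_0, ..., x_k, compute coefficients c minimizing
    the norm of the combined residuals sum_i c_i (x_{i+1} - x_i)
    subject to sum(c) = 1, with Tikhonov regularization lam, and
    return the extrapolated point sum_i c_i x_i.
    """
    X = np.asarray(iterates, dtype=float)   # shape (k+1, d)
    R = np.diff(X, axis=0)                  # residuals r_i = x_{i+1} - x_i, shape (k, d)
    k = R.shape[0]
    # Solve (R R^T + lam * I) z = 1, then normalize so the coefficients sum to 1.
    z = np.linalg.solve(R @ R.T + lam * np.eye(k), np.ones(k))
    c = z / z.sum()
    # Extrapolated point: affine combination of the iterates.
    return c @ X[:k]
```

On a linearly converging sequence such as x_t = x* + rho^t v, this combination recovers the fixed point x* almost exactly; in the setting of the paper, the "iterates" would be successive CNN parameter snapshots, so the step can be applied after training with very little extra machinery.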

Related articles:
arXiv:2006.06234 [math.OC] (Published 2020-06-11)
Revisiting the Continuity of Rotation Representations in Neural Networks
arXiv:1903.01287 [math.OC] (Published 2019-03-04)
Safety Verification and Robustness Analysis of Neural Networks via Quadratic Constraints and Semidefinite Programming
arXiv:2002.02247 [math.OC] (Published 2020-02-06)
Almost Sure Convergence of Dropout Algorithms for Neural Networks