arXiv Analytics

arXiv:1805.06444 [math.OC]

A geometric integration approach to smooth optimisation: Foundations of the discrete gradient method

Matthias J. Ehrhardt, Erlend S. Riis, Torbjørn Ringholm, Carola-Bibiane Schönlieb

Published 2018-05-16 (Version 1)

Discrete gradient methods, a tool from geometric integration, are optimisation schemes that inherit the energy dissipation property from continuous gradient flow. They are efficient for both convex and nonconvex problems, and by choosing different discrete gradients, one can obtain both zero- and first-order optimisation algorithms. In this paper, we present a comprehensive analysis of discrete gradient methods in optimisation, answering questions about well-posedness of the iterates, convergence rates and optimal step size selection. In particular, we prove under mild assumptions that the iterates are well-posed for all choices of time step, the first such result for discrete gradient methods. We prove that these schemes achieve $\mathcal{O}(1/k)$ and linear convergence rates for all time steps, under standard assumptions on the objective function, such as smoothness, strong convexity or the Polyak-{\L}ojasiewicz property. Furthermore, we recover the optimal rates of classical schemes such as explicit gradient descent and stochastic coordinate descent. The analysis is carried out for three discrete gradients---the Gonzalez discrete gradient, the mean value discrete gradient, and the Itoh--Abe discrete gradient---as well as for a randomised Itoh--Abe method.
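
The abstract's central object is the discrete gradient iteration $x_{k+1} = x_k - \tau\, \overline{\nabla} f(x_k, x_{k+1})$, where the discrete gradient $\overline{\nabla} f$ satisfies $\langle \overline{\nabla} f(x,y), y-x\rangle = f(y)-f(x)$, which forces monotone decrease of $f$ for every time step $\tau > 0$. The sketch below is not the authors' code; it is a minimal Python illustration of the Itoh--Abe variant mentioned in the abstract, which updates one coordinate at a time by solving a scalar equation and therefore needs no derivatives (the step size, test objective, and the simple secant solver are assumptions made for the example).

```python
# Minimal sketch of the Itoh--Abe discrete gradient method (zero-order variant).
# Each coordinate update solves the scalar equation
#   (y_i - x_i)/tau = -( f(..., y_i, ...) - f(..., x_i, ...) ) / (y_i - x_i),
# which implies f decreases at every coordinate update, for any tau > 0.
import numpy as np


def secant(h, t0, t1, tol=1e-12, max_iter=100):
    """Simple secant root-finder for the scalar update equation (illustrative helper)."""
    h0, h1 = h(t0), h(t1)
    for _ in range(max_iter):
        if h1 == h0:
            break
        t2 = t1 - h1 * (t1 - t0) / (h1 - h0)
        t0, h0 = t1, h1
        t1, h1 = t2, h(t2)
        if abs(t1 - t0) < tol:
            break
    return t1


def itoh_abe_step(f, x, tau):
    """One Itoh--Abe discrete gradient iteration: sequential, derivative-free coordinate updates."""
    y = x.copy()
    f_prev = f(y)
    for i in range(len(y)):
        xi = y[i]

        def coord_f(t, i=i):
            z = y.copy()
            z[i] = t
            return f(z)

        def residual(t):
            # Residual of the scalar update equation; the singularity at t = xi is
            # removable (its limit is the partial derivative), handled by a tiny shift.
            d = t - xi
            if abs(d) < 1e-14:
                d = 1e-14
            return d / tau + (coord_f(t) - f_prev) / d

        y[i] = secant(residual, xi - 1e-4, xi + 1e-4)
        f_prev = coord_f(y[i])
    return y, f_prev


if __name__ == "__main__":
    # Usage on a strongly convex quadratic (hypothetical test problem):
    # the objective value decreases monotonically for any choice of tau.
    A = np.diag([1.0, 10.0])
    f = lambda z: 0.5 * float(z @ A @ z)
    x = np.array([3.0, -2.0])
    for k in range(20):
        x, fx = itoh_abe_step(f, x, tau=0.5)
    print(x, fx)
```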

Related articles:
arXiv:1807.07554 [math.OC] (Published 2018-07-19)
A geometric integration approach to nonsmooth, nonconvex optimisation
arXiv:2102.07464 [math.OC] (Published 2021-02-15)
Foundations of Multistage Stochastic Programming
arXiv:2112.03581 [math.OC] (Published 2021-12-07, updated 2022-07-14)
Kantorovich-Rubinstein distance and barycenter for finitely supported measures: Foundations and Algorithms