arXiv Analytics


arXiv:1807.07554 [math.OC]

A geometric integration approach to nonsmooth, nonconvex optimisation

Erlend S. Riis, Matthias J. Ehrhardt, G. R. W. Quispel, Carola-Bibiane Schönlieb

Published 2018-07-19 (Version 1)

The optimisation of nonsmooth, nonconvex functions without access to gradients is a particularly challenging problem that is frequently encountered, for example in model parameter optimisation problems. Bilevel optimisation of parameters is a standard setting in areas such as variational regularisation problems and supervised machine learning. We present efficient and robust derivative-free methods called randomised Itoh–Abe methods. These are generalisations of the Itoh–Abe discrete gradient method, a well-known scheme from geometric integration, which has previously only been considered in the smooth setting. We demonstrate that the methods and their favourable energy dissipation properties are well-defined in the nonsmooth setting. Furthermore, we prove that whenever the objective function is locally Lipschitz continuous, the iterates almost surely converge to a connected set of Clarke stationary points. We present an implementation of the methods and apply it to various test problems. The numerical results indicate that the randomised Itoh–Abe methods are superior to state-of-the-art derivative-free optimisation methods in solving nonsmooth problems while remaining competitive in terms of efficiency.
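
For readers unfamiliar with discrete gradient methods, the following is a minimal sketch of the classical (smooth-setting) Itoh–Abe scheme as it is usually stated in the geometric integration literature; the notation V for the objective and τ > 0 for the time step is assumed here and may differ from the paper's own conventions. A discrete gradient of V is a map satisfying the mean-value property
\[
\langle \overline{\nabla} V(x, y), \, y - x \rangle = V(y) - V(x),
\]
and the Itoh–Abe (coordinate-increment) choice defines it componentwise, using only evaluations of V:
\[
\bigl(\overline{\nabla} V(x, y)\bigr)_i
= \frac{V(y_1, \dots, y_i, x_{i+1}, \dots, x_n) - V(y_1, \dots, y_{i-1}, x_i, \dots, x_n)}{y_i - x_i}.
\]
Applying this to the gradient flow \(\dot{x} = -\nabla V(x)\) gives the implicit update
\[
\frac{x_{k+1} - x_k}{\tau} = -\overline{\nabla} V(x_k, x_{k+1}),
\]
which, by the mean-value property, dissipates the objective at every step:
\[
V(x_{k+1}) - V(x_k) = -\frac{1}{\tau} \, \|x_{k+1} - x_k\|^2 \le 0.
\]
Each componentwise equation involves only function values and can be solved by a scalar root-finder, which is what makes the scheme derivative-free; how the randomisation over update directions enters the nonsmooth generalisation is detailed in the paper itself.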

Related articles:
arXiv:1805.06444 [math.OC] (Published 2018-05-16)
A geometric integration approach to smooth optimisation: Foundations of the discrete gradient method