arXiv Analytics

arXiv:1901.11311 [cs.LG]

New Tricks for Estimating Gradients of Expectations

Christian J. Walder, Richard Nock, Cheng Soon Ong, Masashi Sugiyama

Published 2019-01-31 (Version 1)

We derive a family of Monte Carlo estimators for gradients of expectations of univariate distributions, which is related to the log-derivative trick but involves pairwise interactions between samples. The first of these arises from either a) introducing and approximating an integral representation based on the fundamental theorem of calculus, or b) applying the reparameterisation trick to an implicit parameterisation under infinitesimal perturbation of the parameters. From the former perspective we generalise to a reproducing kernel Hilbert space representation, giving rise to a locality parameter in the pairwise interactions mentioned above. The resulting estimators are unbiased and are shown to offer an independent component of useful information in comparison with the log-derivative estimator. Promising analytical and numerical examples confirm the intuitions behind the new estimators.
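For context on the comparison made above, the two standard baselines named in the abstract, the log-derivative (score-function) trick and the reparameterisation trick, can be sketched for a univariate Gaussian. The choice of distribution and test function below is illustrative only and does not come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def score_function_grad(f, mu, sigma, n=200_000):
    """Log-derivative (score-function) estimate of d/dmu E_{N(mu, sigma^2)}[f(x)].

    Uses the identity E[f(x) * d/dmu log p(x)],
    where d/dmu log N(x; mu, sigma^2) = (x - mu) / sigma^2.
    """
    x = rng.normal(mu, sigma, size=n)
    return np.mean(f(x) * (x - mu) / sigma**2)

def reparam_grad(fprime, mu, sigma, n=200_000):
    """Reparameterisation estimate of the same gradient.

    Writes x = mu + sigma * eps with eps ~ N(0, 1), so the gradient
    with respect to mu is E[f'(mu + sigma * eps)].
    """
    eps = rng.normal(size=n)
    return np.mean(fprime(mu + sigma * eps))

# Sanity check on f(x) = x^2: E[x^2] = mu^2 + sigma^2, so the true
# gradient with respect to mu is 2 * mu.
mu, sigma = 1.5, 1.0
est_score = score_function_grad(lambda x: x**2, mu, sigma)
est_reparam = reparam_grad(lambda x: 2 * x, mu, sigma)
```

Both estimates should land near the true value 2 * mu = 3.0; the reparameterisation estimate typically has much lower variance, which is one reason pairwise-interaction estimators such as those proposed here are evaluated against both baselines.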

Related articles:
arXiv:2409.07594 [cs.LG] (Published 2024-09-11)
Automated Discovery of Pairwise Interactions from Unstructured Data
Zuheng et al.
arXiv:1907.05600 [cs.LG] (Published 2019-07-12)
Generative Modeling by Estimating Gradients of the Data Distribution
arXiv:2002.06043 [cs.LG] (Published 2020-02-14)
Estimating Gradients for Discrete Random Variables by Sampling without Replacement