Journal
SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 38, Issue 2, Pages A712-A736
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/15M102318X
Keywords
stochastic differential equations; adaptive thermostat; Bayesian sampling; machine learning; invariant measure; ergodicity
Funding
- University of Edinburgh
- China Scholarship Council
- EPSRC [EP/K039512/1] Funding Source: UKRI
- Engineering and Physical Sciences Research Council [EP/K039512/1] Funding Source: researchfish
Abstract
We study numerical methods for sampling probability measures in high dimension where the underlying model is only approximately identified with a gradient system. Extended stochastic dynamical methods are discussed which have application to multiscale models, nonequilibrium molecular dynamics, and Bayesian sampling techniques arising in emerging machine learning applications. In addition to providing a more comprehensive discussion of the foundations of these methods, we propose a new numerical method for the adaptive Langevin/stochastic gradient Nosé-Hoover thermostat that achieves a dramatic improvement in numerical efficiency over the most popular stochastic gradient methods reported in the literature. We also demonstrate that the newly established method inherits a superconvergence property (fourth-order convergence to the invariant measure for configurational quantities) recently demonstrated in the setting of Langevin dynamics. Our findings are verified by numerical experiments.
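To make the sampling dynamics concrete, the following is a minimal sketch of the basic Euler-type stochastic gradient Nosé-Hoover thermostat (SGNHT) update, in which an auxiliary friction variable adapts so that the average kinetic energy matches the target temperature. This is only an illustration of the standard scheme the abstract refers to, not the authors' improved superconvergent method; the step size `h`, noise strength `A`, and unit temperature are illustrative assumptions.

```python
import numpy as np

def sgnht_sample(grad_logp, q0, n_steps=5000, h=0.01, A=1.0, rng=None):
    """Euler-type SGNHT / adaptive Langevin integrator (illustrative sketch).

    grad_logp : callable returning the (possibly noisy) gradient of the
                log target density at q
    A         : assumed strength of the injected noise (and of any
                unknown gradient noise it compensates for)
    kT = 1 is assumed; xi is the adaptive thermostat variable.
    """
    rng = rng or np.random.default_rng(0)
    q = np.asarray(q0, dtype=float).copy()
    p = rng.standard_normal(q.shape)   # momenta
    xi = A                             # initialise thermostat variable
    d = q.size
    samples = []
    for _ in range(n_steps):
        # momentum update: force, adaptive friction, injected noise
        p += (h * grad_logp(q) - h * xi * p
              + np.sqrt(2.0 * A * h) * rng.standard_normal(q.shape))
        # position update
        q += h * p
        # thermostat update: drive kinetic temperature p.p/d toward kT = 1
        xi += h * (p @ p / d - 1.0)
        samples.append(q.copy())
    return np.array(samples)
```

For example, with `grad_logp = lambda q: -q` the target is a standard Gaussian, and the empirical variance of the post-burn-in samples should approach 1 for small `h`.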