Beyond Log-concavity: Provable Guarantees For Sampling Multi-modal Distributions Using Simulated Tempering Langevin Monte Carlo

Authors:
Holden Lee (Princeton)
Andrej Risteski (MIT)
Rong Ge (Duke University)

Abstract:

A key task in Bayesian machine learning is sampling from distributions that are only specified up to a partition function (i.e., constant of proportionality). One prevalent example of this is sampling posteriors of parametric distributions, such as latent-variable generative models. However, sampling (even very approximately) can be #P-hard.
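As a concrete illustration of the sampling task, the sketch below runs plain overdamped Langevin Monte Carlo on a toy bimodal target specified only up to its normalizing constant. This is not the paper's simulated tempering algorithm, just the base chain it builds on; the function names, step size, and the two-Gaussian example are assumptions made for the illustration.

```python
import numpy as np

def langevin_step(x, grad_f, eta, rng):
    """One unadjusted Langevin step: x <- x - eta * grad f(x) + sqrt(2*eta) * noise."""
    return x - eta * grad_f(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)

def sample_langevin(grad_f, x0, eta=1e-2, n_steps=10_000, seed=0):
    """Run Langevin Monte Carlo targeting p(x) proportional to exp(-f(x)).

    Only the gradient of f is needed, so the partition function is never computed.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        x = langevin_step(x, grad_f, eta, rng)
        samples[t] = x
    return samples

# Toy multimodal target (hypothetical example): mixture of two unit-variance Gaussians
# at +mu and -mu, i.e. f(x) = -log(exp(-(x-mu)^2/2) + exp(-(x+mu)^2/2)).
def grad_f(x, mu=4.0):
    w = 1.0 / (1.0 + np.exp(-2 * mu * x))   # responsibility of the +mu component
    return (x - mu) * w + (x + mu) * (1 - w)

samples = sample_langevin(grad_f, x0=np.array([0.0]))
```

With well-separated modes, a single Langevin chain like this tends to stay near the mode it starts in; the paper's contribution is showing that combining Langevin dynamics with simulated tempering provably overcomes this for mixtures of log-concave components.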
