We consider the problem of sampling from constrained distributions, which poses significant challenges for both non-asymptotic analysis and algorithmic design. We propose a unified framework, inspired by classical mirror descent, for deriving novel first-order sampling schemes. We prove that, for a general target distribution with a strongly convex potential, our framework implies the existence of a first-order algorithm achieving Õ(ε⁻²d) convergence, suggesting that the state-of-the-art Õ(ε⁻⁶d⁵) rate can be vastly improved. With the important Latent Dirichlet Allocation (LDA) application in mind, we specialize our algorithm to sampling from Dirichlet posteriors and derive the first non-asymptotic Õ(ε⁻²d²) rate for first-order sampling. We further extend our framework to the mini-batch setting and prove convergence rates when only stochastic gradients are available. Finally, we report promising experimental results for LDA on real datasets.
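The mirror-descent-inspired idea can be sketched concretely for the Dirichlet case: run unadjusted Langevin dynamics in the dual space of an entropic mirror map on the simplex, then map iterates back. The snippet below is a minimal illustration, not the paper's exact algorithm; the Dirichlet parameters, step size, and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 3.0, 4.0])  # hypothetical Dirichlet parameters
A, d = alpha.sum(), alpha.size

def to_simplex(y):
    """Inverse entropic mirror map: y in R^{d-1} -> x in the probability simplex."""
    z = np.append(y, 0.0)   # last coordinate pinned to 0
    z -= z.max()            # numerical stabilization before exponentiation
    e = np.exp(z)
    return e / e.sum()

# Unadjusted Langevin dynamics in the dual space. For a Dirichlet(alpha)
# target under the entropic mirror map, the dual potential gradient takes
# the closed form grad W(y) = A * x(y)[:d-1] - alpha[:d-1], so each step
# needs only first-order information.
gamma, burn, iters = 5e-3, 5_000, 60_000
y = np.zeros(d - 1)
samples = []
for k in range(iters):
    x = to_simplex(y)
    grad = A * x[:-1] - alpha[:-1]
    y = y - gamma * grad + np.sqrt(2 * gamma) * rng.standard_normal(d - 1)
    if k >= burn:
        samples.append(to_simplex(y))

mean = np.mean(samples, axis=0)  # should approach the Dirichlet mean alpha / A
```

The closed-form dual gradient is what keeps the per-iteration cost first-order; as a sanity check, the empirical mean of the retained samples should approach the Dirichlet mean α/Σᵢαᵢ.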