SHPOS: A Theoretical Guaranteed Accelerated Particle Optimization Sampling Method
Zhijian Li, Chao Zhang, Hui Qian, Xin Du, Lingwei Peng
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
Main Track. Pages 2701-2707.
https://doi.org/10.24963/ijcai.2021/372
Recently, the Stochastic Particle Optimization Sampling (SPOS) method was proposed to address the particle-collapse pitfall of deterministic Particle Variational Inference methods by utilizing stochastic overdamped Langevin dynamics to enhance exploration. In this paper, we propose an accelerated particle optimization sampling method called Stochastic Hamiltonian Particle Optimization Sampling (SHPOS). In contrast to the first-order dynamics used in SPOS, SHPOS adopts augmented second-order dynamics that involve an extra momentum term to achieve acceleration. We establish a non-asymptotic convergence analysis for SHPOS and show that it enjoys a faster convergence rate than SPOS. In addition, we propose a variance-reduced stochastic gradient variant of SHPOS for tasks with large-scale datasets and complex models. Experiments on both synthetic and real data validate our theory and demonstrate the superiority of SHPOS over state-of-the-art methods.
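To make the second-order dynamics concrete, below is a minimal NumPy sketch of one SHPOS-style update step, assuming an RBF kernel, a naive Euler discretization, and exact (non-variance-reduced) gradients. The function names (`shpos_step`, `rbf_kernel`, `grad_log_p`), the step size, friction, and inverse temperature are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix and the repulsive kernel gradients used by SVGD-type drifts."""
    diff = X[:, None, :] - X[None, :, :]                       # (N, N, d): x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))     # (N, N)
    grad_K = diff / h ** 2 * K[:, :, None]                     # (N, N, d): grad_{x_j} k(x_j, x_i)
    return K, grad_K

def shpos_step(X, V, grad_log_p, eps=1e-2, gamma=2.0, beta=1.0, h=1.0):
    """One Euler step of a hypothetical SHPOS-style update: second-order
    (underdamped) Langevin dynamics whose drift is the kernelized
    interacting-particle direction, with friction gamma on the momentum."""
    N, d = X.shape
    K, grad_K = rbf_kernel(X, h)
    G = grad_log_p(X)                              # (N, d); a stochastic estimate in practice
    drift = (K @ G + grad_K.sum(axis=1)) / N       # attraction along the score + kernel repulsion
    noise = np.sqrt(2.0 * gamma * eps / beta) * np.random.randn(N, d)
    V = V - eps * (gamma * V - drift) + noise      # momentum update with friction and injected noise
    X = X + eps * V                                # position driven by the momentum
    return X, V

# Toy usage: sample a standard Gaussian, whose score is grad log p(x) = -x.
X = np.random.randn(50, 2) * 3.0
V = np.zeros_like(X)
for _ in range(1000):
    X, V = shpos_step(X, V, grad_log_p=lambda X: -X)
```

Setting `gamma * V` to dominate (large friction) recovers first-order, SPOS-like behavior, which is one way to see where the momentum term provides the extra acceleration.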
Keywords:
Machine Learning: Bayesian Learning
Uncertainty in AI: Approximate Probabilistic Inference
Uncertainty in AI: Exact Probabilistic Inference