An Efficient Minibatch Acceptance Test for Metropolis-Hastings

Daniel Seita, Xinlei Pan, Haoyu Chen, John Canny

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Sister Conferences Best Papers Track. Pages 5359-5363. https://doi.org/10.24963/ijcai.2018/753

We present a novel Metropolis-Hastings method for large datasets that uses small expected-size minibatches of data. Previous work on reducing the cost of Metropolis-Hastings tests yields only constant-factor reductions versus using the full dataset for each sample. Here we present a method that can be tuned to provide arbitrarily small batch sizes by adjusting either the proposal step size or the temperature. Our test uses the noise-tolerant Barker acceptance test with a novel additive correction variable. The resulting test has a cost similar to that of a normal SGD update. Our experiments demonstrate speedups of several orders of magnitude over previous work.
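
To make the acceptance rule concrete, the sketch below contrasts the classical Barker test with a minibatch variant that adds a correction variable to a noisy log-ratio estimate. This is only an illustrative sketch: the function names are my own, and the Gaussian moment-matching stand-in for the correction distribution (with its implied noise bound sigma_hat^2 < pi^2/3) is an assumption; the paper constructs the exact correction distribution numerically rather than by moment matching.

import numpy as np

LOGISTIC_VAR = np.pi ** 2 / 3.0  # variance of the standard logistic distribution


def barker_accept(delta, rng=None):
    """Classical Barker test on the exact log ratio `delta`:
    accept with probability 1 / (1 + exp(-delta)).
    Equivalently, accept iff delta + V > 0 with V ~ Logistic(0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.random() < 1.0 / (1.0 + np.exp(-delta))


def sample_correction(sigma_hat, rng):
    """Placeholder correction sampler (NOT the paper's construction).
    The method needs a variable C such that N(0, sigma_hat^2) + C is
    distributed as Logistic(0, 1); the paper builds that distribution
    numerically.  Here we only moment-match with a Gaussian, which
    requires sigma_hat^2 < pi^2 / 3."""
    assert sigma_hat ** 2 < LOGISTIC_VAR, "minibatch noise too large"
    return rng.normal(0.0, np.sqrt(LOGISTIC_VAR - sigma_hat ** 2))


def minibatch_barker_accept(delta_hat, sigma_hat, rng=None):
    """Minibatch variant: `delta_hat` is a minibatch estimate of the
    log ratio whose error is roughly N(0, sigma_hat^2) by the CLT.
    Adding the correction variable makes the total noise on delta
    (approximately) standard logistic, so this simple sign test
    behaves like a Barker test on the true log ratio."""
    rng = np.random.default_rng() if rng is None else rng
    return delta_hat + sample_correction(sigma_hat, rng) > 0.0

In use, each MCMC step would compute delta_hat and sigma_hat from a small minibatch of data points and call minibatch_barker_accept, only growing the batch if the noise bound is violated.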
Keywords:
Machine Learning: Machine Learning
Uncertainty in AI: Approximate Probabilistic Inference