Lifted Hybrid Variational Inference

Yuqiao Chen, Yibo Yang, Sriraam Natarajan, Nicholas Ruozzi

Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 4237-4244. https://doi.org/10.24963/ijcai.2020/585

Lifted inference algorithms exploit model symmetry to reduce computational cost in probabilistic inference. However, most existing lifted inference algorithms operate only over discrete domains or continuous domains with restricted potential functions. We investigate two approximate lifted variational approaches that apply to domains with general hybrid potentials, and are expressive enough to capture multi-modality. We demonstrate that the proposed variational methods are highly scalable and can exploit approximate model symmetries even in the presence of a large amount of continuous evidence, outperforming existing message-passing-based approaches in a variety of settings. Additionally, we present a sufficient condition for the Bethe variational approximation to yield a non-trivial estimate over the marginal polytope.
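To illustrate the symmetry idea the abstract describes, here is a minimal, hypothetical sketch (not the paper's method): for n exchangeable binary variables with potential exp(theta * sum(x)), states with the same count of ones are interchangeable, so the partition function can be computed by counting states per group instead of enumerating all 2^n configurations.

```python
import math

def partition_naive(n, theta):
    # Ground inference: enumerate all 2^n joint states of n
    # exchangeable binary variables with potential exp(theta * sum(x)).
    return sum(math.exp(theta * bin(s).count("1")) for s in range(2 ** n))

def partition_lifted(n, theta):
    # Lifted inference: states sharing the same count k of ones are
    # symmetric, so weight each group by a binomial coefficient.
    # Cost drops from O(2^n) to O(n).
    return sum(math.comb(n, k) * math.exp(theta * k) for k in range(n + 1))
```

Both functions agree on small models; the lifted version remains tractable as n grows, which is the kind of saving lifted algorithms aim for on symmetric (or approximately symmetric) models.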
Keywords:
Uncertainty in AI: Approximate Probabilistic Inference
Uncertainty in AI: Graphical Models
Uncertainty in AI: Statistical Relational AI