Publication


Kullback-Leibler Proximal Variational Inference

Type of publication Peer-reviewed
Publication form Proceedings (peer-reviewed)
Authors Mohammad Emtiyaz Khan, Pierre Baqué, François Fleuret, Pascal Fua
Project Tracking in the Wild

Proceedings (peer-reviewed)

Title of proceedings Proceedings of the International Conference on Neural Information Processing Systems (NIPS)

Open Access

Abstract

We propose a new variational inference method based on a proximal framework that uses the Kullback-Leibler (KL) divergence as the proximal term. We make two contributions towards exploiting the geometry and structure of the variational bound. First, we propose a KL proximal-point algorithm and show its equivalence to variational inference with natural gradients (e.g., stochastic variational inference). Second, we use the proximal framework to derive efficient variational algorithms for non-conjugate models. We propose a splitting procedure to separate non-conjugate terms from conjugate ones. We linearize the non-conjugate terms to obtain subproblems that admit a closed-form solution. Overall, our approach converts inference in a non-conjugate model into subproblems that involve inference in well-known conjugate models. We show that our method is applicable to a wide variety of models and can result in computationally efficient algorithms. Applications to real-world datasets show performance comparable to that of existing methods.
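
For orientation, a minimal sketch of the proximal step the abstract describes, in illustrative notation (\lambda for the variational parameters, \mathcal{L} for the variational lower bound, \beta_k > 0 for a step size; these symbols are assumptions for exposition, not necessarily the paper's):

\[
\lambda_{k+1} \;=\; \operatorname*{arg\,max}_{\lambda}\;\; \mathcal{L}(\lambda) \;-\; \frac{1}{\beta_k}\,\mathrm{KL}\!\left[\, q(z \mid \lambda) \;\middle\|\; q(z \mid \lambda_k) \,\right].
\]

Under the splitting described above, \mathcal{L} = \mathcal{L}_c + \mathcal{L}_n, where \mathcal{L}_n collects the non-conjugate terms. Linearizing \mathcal{L}_n around the current iterate \lambda_k gives a subproblem of the form

\[
\lambda_{k+1} \;=\; \operatorname*{arg\,max}_{\lambda}\;\; \mathcal{L}_c(\lambda) \;+\; \nabla \mathcal{L}_n(\lambda_k)^{\top} \lambda \;-\; \frac{1}{\beta_k}\,\mathrm{KL}\!\left[\, q(z \mid \lambda) \;\middle\|\; q(z \mid \lambda_k) \,\right],
\]

which has the structure of inference in a conjugate model and hence admits a closed-form solution.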