Part I: VI Basics & Overview

Kickstart with the two most important survey papers in VI.

Variational Inference: A Review for Statisticians (2017)
Blei, David M., Alp Kucukelbir, and Jon D. McAuliffe

Advances in Variational Inference (2018)
Zhang, Cheng, Judith Bütepage, Hedvig Kjellström, and Stephan Mandt
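
Both surveys are organized around the same objective, so it helps to keep the evidence lower bound (ELBO) in view while reading. In the usual notation (data x, latent variables z, variational family q), written here only as a standard reminder rather than a quote from either paper:

```latex
% Evidence lower bound (ELBO): maximizing it over q tightens a lower bound on
% log p(x) and drives q(z) toward the posterior p(z | x).
\mathrm{ELBO}(q)
  = \mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big]
  = \log p(x) - \mathrm{KL}\big(q(z) \,\|\, p(z \mid x)\big)
```

Nearly every paper below is, in one way or another, about estimating or optimizing this quantity.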

Part II: Scalable VI

Papers in this part apply VI to settings where the dataset is massive and scalable inference is needed.

Stochastic Variational Inference (2013)
Hoffman, Matthew D., David M. Blei, Chong Wang, and John Paisley

An Adaptive Learning Rate for Stochastic Variational Inference (2013)
Ranganath, Rajesh, Chong Wang, David Blei, and Eric Xing

Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam (2018)
Emtiyaz Khan, Mohammad, Didrik Nielsen, Voot Tangkaratt, Wu Lin, Yarin Gal, and Akash Srivastava
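
As a concrete illustration of the theme of this part, here is a minimal sketch of an SVI-style update in the spirit of Hoffman et al.: subsample the data, form a noisy natural-parameter estimate as if the minibatch were the whole dataset, and take a Robbins-Monro step. The toy model, hyperparameters, and step-size schedule are my own illustrative choices, not taken from any of the papers above.

```python
# A minimal sketch of stochastic variational inference (SVI) on a deliberately
# simple conjugate model, so the noisy natural-gradient update reduces to a
# convex combination of natural parameters.
# Assumed model (for illustration only):
#   mu ~ Normal(0, tau0^2),  x_i | mu ~ Normal(mu, sigma^2),  i = 1..N
# Variational family: q(mu) = Normal(m, s^2), tracked via natural parameters.
import numpy as np

rng = np.random.default_rng(0)
N, sigma, tau0 = 100_000, 1.0, 10.0
x = rng.normal(3.0, sigma, size=N)           # synthetic data with true mean 3.0

B = 100                                      # minibatch size
eta = np.array([0.0, -0.5 / tau0**2])        # natural params of q(mu), init at prior
for t in range(1, 2001):
    rho = (t + 10.0) ** -0.7                 # Robbins-Monro step size
    batch = rng.choice(x, size=B, replace=False)
    # "As if the whole dataset looked like this minibatch": scale sufficient stats by N/B.
    eta_hat = np.array([
        (N / B) * batch.sum() / sigma**2,          # first natural parameter
        -0.5 * (1.0 / tau0**2 + N / sigma**2),     # second natural parameter
    ])
    eta = (1.0 - rho) * eta + rho * eta_hat        # noisy natural-gradient step

post_var = -1.0 / (2.0 * eta[1])
post_mean = eta[0] * post_var
print(f"q(mu) ~= Normal({post_mean:.3f}, {post_var:.2e})")  # mean should be close to 3.0
```

The point of the construction is that each update touches only B of the N data points, which is what makes the approach scale.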

Part III: Black Box VI

Let’s get rid of model-specific derivations (e.g., conditional conjugacy requirements) and perform VI on a broader range of models and problems.

Black Box Variational Inference (2014)
Ranganath, Rajesh, Sean Gerrish, and David Blei

Local Expectation Gradients for Black Box Variational Inference (2015)
Titsias, Michalis K., and Miguel Lázaro-Gredilla
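
To make the "black box" point concrete, the sketch below estimates ELBO gradients with the score-function (REINFORCE) identity, so the only model-specific code is the log-joint density; everything else is generic. The toy model, Gaussian variational family, and AdaGrad settings are illustrative assumptions of mine, not taken from either paper.

```python
# A minimal sketch of black box VI: ELBO gradients come from the score-function
# identity grad ELBO = E_q[ grad_log_q(z) * (log p(x, z) - log q(z)) ],
# estimated by Monte Carlo with samples from q.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=50)             # data from Normal(3, 1)

def log_joint(mu):
    # log p(x, mu): prior mu ~ Normal(0, 10^2), likelihood x_i ~ Normal(mu, 1)
    return -0.5 * mu**2 / 100.0 - 0.5 * np.sum((x - mu) ** 2)

lam = np.zeros(2)                             # [m, log_s] of q(mu) = Normal(m, exp(log_s)^2)
G = np.zeros(2)                               # AdaGrad accumulator
lr, S = 0.1, 100                              # learning rate, samples per step
for t in range(2000):
    m, s = lam[0], np.exp(lam[1])
    z = rng.normal(m, s, size=S)              # samples from q
    log_q = -np.log(s) - 0.5 * ((z - m) / s) ** 2 - 0.5 * np.log(2 * np.pi)
    f = np.array([log_joint(zi) for zi in z]) - log_q
    score = np.stack([(z - m) / s**2,              # d log q / d m
                      ((z - m) / s) ** 2 - 1.0])   # d log q / d log_s
    grad = (score * f).mean(axis=1)           # Monte Carlo ELBO gradient
    G += grad**2
    lam += lr * grad / np.sqrt(G + 1e-8)      # AdaGrad ascent step

print(f"q(mu) ~= Normal({lam[0]:.3f}, {np.exp(lam[1])**2:.4f})")  # fitted mean should land near ~3
```

In practice this estimator is noisy, which is why the papers above pair it with Rao-Blackwellization, control variates, or local expectation gradients.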

Part IV: Reparameterized VI

Here’s another route to general-purpose VI: the reparameterization trick, which gives low-variance pathwise gradient estimators.

Stochastic Backpropagation and Approximate Inference in Deep Generative Models (2014)
Rezende, Danilo Jimenez, Shakir Mohamed, and Daan Wierstra

Categorical Reparameterization with Gumbel-Softmax (2017)
Jang, Eric, Shixiang Gu, and Ben Poole

Quasi-Monte Carlo Variational Inference (2018)
Buchholz, Alexander, Florian Wenzel, and Stephan Mandt
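
A minimal sketch of the reparameterization (pathwise) trick that underlies this part: write the sample as z = m + exp(log_s) * eps with eps ~ Normal(0, 1) and differentiate through it. The toy model is the same illustrative one used in Part III (my assumption, not from the papers), and the gradients are written out by hand where a framework with automatic differentiation would normally do that step.

```python
# A minimal sketch of the reparameterization (pathwise) gradient for VI:
# the gradient flows through the sample z = m + s * eps instead of through
# the density, as in the score-function estimator of Part III.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=50)              # data from Normal(3, 1)

def dlogp_dz(z):
    # derivative of log p(x, z): prior z ~ Normal(0, 10^2), likelihood x_i ~ Normal(z, 1)
    return -z / 100.0 + np.sum(x) - x.size * z

m, log_s = 0.0, 0.0                            # q(z) = Normal(m, exp(log_s)^2)
lr, S = 1e-2, 8                                # a handful of samples per step suffices
for t in range(2000):
    s = np.exp(log_s)
    eps = rng.normal(size=S)
    z = m + s * eps                            # reparameterized samples
    g = dlogp_dz(z)                            # pathwise gradient of the model term
    grad_m = np.mean(g)                        # dz/dm = 1
    grad_log_s = np.mean(g * s * eps) + 1.0    # dz/dlog_s = s*eps; +1 from q's entropy
    m += lr * grad_m
    log_s += lr * grad_log_s

print(f"q(z) ~= Normal({m:.3f}, {np.exp(log_s)**2:.4f})")
# Exact posterior for comparison: Normal(sum(x) / (50 + 0.01), 1 / (50 + 0.01))
```

Compared with the score-function estimator in Part III, a handful of samples per step is typically enough, which is the main practical appeal of this family of methods.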

Part V: TBD

May or may not include VI with other divergence measures, structured VI (normalizing flows, hierarchical VI, etc.), and amortized VI (VAE and its variants).

Variational Inference Reading List - Sida Li