Paper: On the Difficulty of Unbiased Alpha Divergence Minimization
By
Short description: Variational inference approximates a target distribution with a simpler one. While traditional variational inference minimizes the “exclusive” KL-divergence, several algorithms have recently been proposed to minimize other alpha-divergences. Experimentally, however, these algorithms often seem to fail to converge. In this paper we analyze the variance of the unbiased gradient estimators underlying these methods. Our results are very pessimistic: for any alpha-divergence other than the traditional exclusive KL-divergence, the signal-to-noise ratio of the gradient estimator decays exponentially in the dimensionality of the problem.
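
To make the exponential-decay claim concrete, here is a minimal simulation sketch in Python/NumPy. It is not code from the paper; the setup is an assumption chosen for illustration: target p = N(0, I_d), approximation q = N(mu·1, I_d) with mu = 0.2, reparameterized samples z = mu + eps, and alpha = 2 (the chi-squared case). It compares the empirical signal-to-noise ratio (|mean| / std across samples) of the per-sample gradient of E_q[(p/q)^alpha] against that of the usual reparameterized exclusive-KL (ELBO-style) gradient as the dimension d grows.

import numpy as np

rng = np.random.default_rng(0)
mu, alpha, n = 0.2, 2.0, 100_000   # assumed toy settings, not from the paper

def snr(g):
    # empirical signal-to-noise ratio of a vector of per-sample gradients
    return abs(g.mean()) / (g.std() + 1e-12)

for d in (1, 2, 5, 10, 20, 50):
    eps = rng.standard_normal((n, d))
    s = eps.sum(axis=1)                      # sum of noise coordinates
    # log importance weight log p(z) - log q(z) with z = mu + eps
    log_w = -mu * s - 0.5 * d * mu**2
    # per-sample reparameterized estimate of E_q[(p/q)^alpha] and its
    # analytic derivative with respect to mu
    h = np.exp(alpha * log_w)
    g_alpha = -alpha * (s + d * mu) * h
    # reparameterized gradient of the exclusive KL (the standard estimator):
    # d/dmu [log q(z) - log p(z)] = s + d * mu
    g_kl = s + d * mu
    print(f"d={d:3d}   SNR alpha=2: {snr(g_alpha):9.3e}   SNR KL: {snr(g_kl):9.3e}")

Under these assumptions the exclusive-KL gradient’s SNR grows roughly like mu·sqrt(d), while the alpha = 2 gradient’s SNR collapses as d increases (and the empirical estimate itself becomes unreliable at large d, which is exactly the failure mode described above).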