Reparameterization Gradient Message Passing

Abstract

In this paper we consider efficient message passing-based inference in a factor graph representation of a probabilistic model. Current message passing methods, such as belief propagation, variational message passing, or expectation propagation, rely on analytically pre-computed message update rules. In practical models, it is often not feasible to derive update rules analytically for all factors in the graph, and as a result, efficient message passing-based inference cannot proceed. In related research on (non-message passing-based) inference, a "reparameterization trick" has led to a considerable extension of the class of models for which automated inference is possible. In this paper, we introduce Reparameterization Gradient Message Passing (RGMP), a new message passing method based on the reparameterization gradient. In most models, the large majority of messages can be derived analytically, and we resort to RGMP only when necessary. We argue that this kind of hybrid message passing naturally leads to low-variance gradients.
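To illustrate the underlying idea, the following is a minimal sketch of the reparameterization trick itself, not of the paper's RGMP algorithm: the gradient of an expectation under a Gaussian is estimated by writing the sample as z = mu + sigma * eps with eps drawn from a standard normal, so that automatic differentiation can pass through the sample. All names here (f, mu, log_sigma, expectation_estimate) are illustrative assumptions, not identifiers from the paper.

# Minimal reparameterization-gradient sketch (assumed example, not the paper's code).
import jax
import jax.numpy as jnp

def f(z):
    # Arbitrary integrand; in a message passing setting this role would be
    # played by terms involving the intractable factor.
    return jnp.sin(z) + z ** 2

def expectation_estimate(params, eps):
    mu, log_sigma = params
    z = mu + jnp.exp(log_sigma) * eps   # reparameterized sample z = mu + sigma * eps
    return jnp.mean(f(z))               # Monte Carlo estimate of E[f(z)]

key = jax.random.PRNGKey(0)
eps = jax.random.normal(key, (1000,))   # base noise, independent of the parameters
params = (0.5, jnp.log(0.3))
grad_estimate = jax.grad(expectation_estimate)(params, eps)
print(grad_estimate)  # gradient estimate w.r.t. (mu, log_sigma)

Because the randomness is drawn independently of the parameters, the gradient estimator typically has much lower variance than score-function (REINFORCE-style) estimators, which is the property the abstract alludes to for hybrid analytical/RGMP message passing.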

Publication
27th European Signal Processing Conference