Adaptive importance sampling message passing

Abstract

The aim of Probabilistic Programming (PP) is to automate inference in probabilistic models. One efficient realization of PP-based inference is variational message passing (VMP) in a factor graph. VMP is efficient but, in principle, yields closed-form update rules only when the model consists of conjugate and/or conditionally conjugate factor pairs. Recently, Extended Variational Message Passing (EVMP) has been proposed to broaden the applicability of VMP through importance sampling-based particle methods for non-linear and non-conjugate factor pairs. EVMP automates the importance sampling procedure by employing forward messages as proposal distributions, which unfortunately may lead to inaccurate estimates and numerical instabilities when the forward message poorly represents the unknown correct posterior. This paper addresses this issue by integrating an adaptive importance sampling procedure with message passing-based inference. The resulting method is a hyperparameter-free approximate inference engine that combines recent advances in stochastic adaptive importance sampling and optimization methods. We provide an implementation of the proposed method in the Julia package ForneyLab.jl.
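To illustrate the core idea behind adaptive importance sampling (not the paper's specific algorithm or the ForneyLab.jl API), the sketch below estimates a one-dimensional Gaussian target starting from a deliberately mismatched proposal, analogous to a poor forward message. At each iteration the proposal is re-fitted by moment-matching the self-normalized weighted sample, so the proposal adapts toward the target. All names and numbers here are illustrative assumptions.

```python
import math
import random

random.seed(0)


def log_target(x):
    # Unnormalized log-density of a toy "posterior" N(4, 0.5^2),
    # a stand-in for the unknown correct posterior.
    return -0.5 * (x - 4.0) ** 2 / 0.25


def log_gauss(x, mu, sigma):
    # Log-density of the Gaussian proposal.
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))


# Deliberately mismatched initial proposal (like an uninformative forward message).
mu, sigma = 0.0, 3.0

for _ in range(20):  # adaptation loop
    xs = [random.gauss(mu, sigma) for _ in range(500)]
    lw = [log_target(x) - log_gauss(x, mu, sigma) for x in xs]
    m = max(lw)  # subtract the max log-weight for numerical stability
    w = [math.exp(l - m) for l in lw]
    s = sum(w)
    w = [wi / s for wi in w]  # self-normalized importance weights
    # Moment-match the proposal to the weighted sample.
    mu = sum(wi * xi for wi, xi in zip(w, xs))
    var = sum(wi * (xi - mu) ** 2 for wi, xi in zip(w, xs))
    sigma = max(math.sqrt(var), 1e-3)

print(mu, sigma)  # proposal has adapted toward the target N(4, 0.5^2)
```

In a message passing setting, the adapted proposal would play the role the forward message plays in EVMP, but tuned to the target rather than fixed in advance.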

Publication
IEEE International Symposium on Information Theory
Semih Akbayrak
Former PhD student

Former researcher at BIASlab.

İsmail Şenöz
Chief Scientist
LazyDynamics

İsmail Şenöz is a co-founder and chief scientist of LazyDynamics.

Bert de Vries
Professor

I am a professor at TU Eindhoven and team leader of BIASlab.