ForneyLab.jl: Fast and flexible automated inference through message passing in Julia

Abstract

Probabilistic programming systems usually rely on inference methods that require no manual derivations. While these ‘derivation-free’ algorithms can perform inference in a wide range of models, their generality comes at the price of significant computational load. The message passing paradigm provides a convenient way to exploit model-specific properties semi-automatically, leading to fast and flexible algorithms. We developed ForneyLab.jl to automatically generate code that implements (approximate) Bayesian inference in a given model through message passing. Executing a ForneyLab-derived algorithm for Bayesian inference in a random walk model proves to be faster than black-box inference algorithms based on sampling or stochastic optimization.
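To illustrate the workflow described in the abstract (model specification, automatic algorithm generation, and execution of the generated code), the sketch below builds a single step of a Gaussian random-walk model. It assumes the API documented for current ForneyLab.jl releases; names such as `@RV`, `GaussianMeanVariance`, `placeholder`, `messagePassingAlgorithm`, `algorithmSourceCode`, and the generated `step!` function follow that documentation and may differ from the version used in the paper, and the model itself is a minimal example rather than the paper's benchmark.

```julia
using ForneyLab

# Minimal sketch: one time step of a Gaussian random-walk model
# with a noisy observation of the latent state.
g = FactorGraph()

@RV x ~ GaussianMeanVariance(0.0, 1.0)  # prior on the latent state
@RV y ~ GaussianMeanVariance(x, 0.1)    # observation model
placeholder(y, :y)                      # y will be supplied as data

# Generate Julia source code for a sum-product (belief propagation)
# algorithm that computes the posterior marginal of x, then compile it.
algo = messagePassingAlgorithm(x)
source_code = algorithmSourceCode(algo)
eval(Meta.parse(source_code))           # defines a step!() function

# Execute the compiled inference step on an observed value of y.
marginals = step!(Dict(:y => 0.7))
println(marginals[:x])                  # posterior marginal over x
```

Because the message passing schedule is compiled to plain Julia code ahead of time, repeated calls to the generated `step!` function avoid the per-sample overhead of black-box sampling or stochastic optimization, which is the source of the speedup claimed in the abstract.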

Publication
ProbProg 2018