Factor graph fragmentization of expectation propagation
Online ISSN 2005-2863 · Print ISSN 1226-3192
RESEARCH ARTICLE
Factor graph fragmentization of expectation propagation

Wilson Y. Chen1 · Matt P. Wand1

Received: 15 July 2019 / Accepted: 23 October 2019
© Korean Statistical Society 2020
Abstract
Expectation propagation is a general approach to fast approximate inference for graphical models. The existing literature treats models separately when it comes to deriving and coding expectation propagation inference algorithms. This comes at the cost of similar, long-winded algebraic steps being repeated, which slows down algorithmic development. We demonstrate how factor graph fragmentization can overcome this impediment. This involves adoption of the message passing on a factor graph approach to expectation propagation and identification of factor graph sub-graphs, which we call fragments, that are common to wide classes of models. Key fragments and their corresponding messages are catalogued, which means that their algebra does not need to be repeated. This allows compartmentalization of coding and efficient software development.

Keywords Approximate Bayesian inference · Generalized linear mixed models · Graphical models · Kullback–Leibler projection · Message passing
Electronic supplementary material The online version of this article (https://doi.org/10.1007/s42952-019-00033-9) contains supplementary material, which is available to authorized users.

Corresponding author: Matt P. Wand [email protected] · Wilson Y. Chen [email protected]

1 University of Technology Sydney, Ultimo, Australia

1 Introduction

Expectation propagation (e.g. Minka 2005) is gaining popularity as a general approach to fitting and inference for large graphical models, including those that arise in statistical contexts such as Bayesian generalized linear mixed models (e.g. Gelman et al. 2014; Kim and Wand 2018). Compared with Markov chain Monte Carlo approaches, expectation propagation has the attractions of speed and parallelizability of the computing across multiple processors, making it more amenable to high volume/velocity data applications. One price to be paid is inferential accuracy, since expectation propagation uses product density simplifications of joint posterior density functions. Another is algebraic overhead: as demonstrated by Kim and Wand (2016), several pages of algebra are required to derive explicit programmable expectation propagation algorithms for even very simple Bayesian models. This article alleviates the latter cost. Using the notions of message passing and factor graph fragments, we demonstrate the compartmentalization of expectation propagation algebra and coding. The resultant infrastructure and updating formulae lead to much more efficient expectation propagation fitting and inference and allow extension to arbitrarily large Bayesian models. Expectation propagation and mean field variational Bayes are the two most common paradigms for obtaining fast approximate inference algorithms for graphical models (e.g. Bishop 2006; Wainwright and Jordan 2008).
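To make the core operation concrete, the following is a minimal sketch (not code from the paper) of the Kullback–Leibler projection step that underlies each expectation propagation message update: a non-Gaussian tilted density is projected onto a Gaussian by moment matching, and the outgoing message is recovered by dividing out the cavity distribution in natural-parameter form. The probit factor, cavity parameters, grid settings and function names are illustrative assumptions only.

```python
# A minimal sketch of the Kullback-Leibler projection (moment matching) step
# at the heart of an expectation propagation update.  The probit factor, the
# cavity parameters and the grid settings are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def project_to_gaussian(factor, cav_mean, cav_var, half_width=10.0, n_grid=2001):
    """KL-project factor(x) * N(x; cav_mean, cav_var) onto a Gaussian by moment matching."""
    sd = np.sqrt(cav_var)
    x = np.linspace(cav_mean - half_width * sd, cav_mean + half_width * sd, n_grid)
    tilted = factor(x) * norm.pdf(x, cav_mean, sd)   # unnormalized tilted density
    Z = np.trapz(tilted, x)                          # normalizing constant
    mean = np.trapz(x * tilted, x) / Z               # matched first moment
    var = np.trapz((x - mean) ** 2 * tilted, x) / Z  # matched second (central) moment
    return Z, mean, var

# Illustrative factor: a probit likelihood contribution for a response y = +1.
cav_mean, cav_var = 0.5, 2.0
Z, post_mean, post_var = project_to_gaussian(lambda x: norm.cdf(x), cav_mean, cav_var)

# The outgoing message is the projected Gaussian divided by the cavity,
# expressed in natural parameters (precision, precision * mean).
msg_precision = 1.0 / post_var - 1.0 / cav_var
msg_precision_mean = post_mean / post_var - cav_mean / cav_var
print(Z, post_mean, post_var, msg_precision, msg_precision_mean)
```

In the terminology of this article, a catalogued fragment packages exactly this kind of update so that the algebra is derived once and then reused across the models in which the fragment appears.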