Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions

Message passing algorithms on graphical models offer a low-complexity and distributed paradigm for computing marginals of a high-dimensional distribution. However, the convergence behavior of message passing algorithms can be heavily affected by the adopted message update schedule. In this paper, we focus on variational message passing (VMP) applied to Gaussian graphical models and analyze its convergence under different schedules. In particular, based on the update equations of VMP under the mean-field assumption, we prove that the mean vectors obtained from VMP are the exact marginal mean vectors under any valid message passing schedule, which justifies the use of VMP in Gaussian graphical models. Furthermore, three categories of valid message passing schedules, namely serial, parallel, and randomized schedules, are considered for the VMP update. Under the basic serial schedule, VMP converges unconditionally, but can be slow in large-scale distributed networks. To speed up the serial schedule, a group serial schedule is proposed that still guarantees VMP convergence. On the other hand, the parallel schedule and its damped variant are applied to accelerate VMP, and their necessary and sufficient convergence conditions are derived. To allow nodes with different local computation resources to compute messages more flexibly and efficiently, a randomized schedule is proposed for the VMP update, and the corresponding probabilistic necessary and sufficient convergence conditions are derived.
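The following is a minimal sketch (not the paper's exact algorithm) of how mean-field VMP on a Gaussian graphical model p(x) ∝ exp(-½ xᵀJx + hᵀx) reduces to simple mean updates whose fixed point solves Jμ = h, i.e. the exact marginal means; the function name `vmp_means`, the toy precision matrix, and the damping value are all illustrative assumptions.

```python
import numpy as np

def vmp_means(J, h, schedule="serial", damping=1.0, iters=200):
    """Iterate the Gaussian mean-field updates mu_i <- (h_i - sum_{j != i} J_ij mu_j) / J_ii
    under a serial (Gauss-Seidel-like) or parallel (Jacobi-like) schedule;
    damping < 1 damps the parallel update."""
    n = len(h)
    mu = np.zeros(n)
    D = np.diag(J)
    for _ in range(iters):
        if schedule == "serial":
            # Sweep nodes one at a time, each using the freshest neighbour means.
            for i in range(n):
                mu[i] = (h[i] - J[i] @ mu + D[i] * mu[i]) / D[i]
        else:
            # Parallel: all nodes update simultaneously from the previous iterate.
            mu_new = (h - J @ mu + D * mu) / D
            mu = (1.0 - damping) * mu + damping * mu_new
    return mu

# Toy 3-node chain: positive definite precision matrix and potential vector.
J = np.array([[ 2.0, -0.8,  0.0],
              [-0.8,  2.0, -0.8],
              [ 0.0, -0.8,  2.0]])
h = np.array([1.0, 0.0, -1.0])

exact = np.linalg.solve(J, h)  # true marginal mean vector
print(np.allclose(vmp_means(J, h, "serial"), exact))                  # True
print(np.allclose(vmp_means(J, h, "parallel", damping=0.7), exact))   # True
```

In this sketch the serial sweep converges for any positive definite J, whereas the parallel update converges only when the spectral radius of the corresponding iteration matrix is below one, which is why the damped variant is of interest.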
Source: IEEE Transactions on Signal Processing