Norm-Product Belief Propagation: Primal-Dual Message-Passing for Approximate Inference
- 18 November 2010
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 56 (12), 6294-6316
- https://doi.org/10.1109/tit.2010.2079014
Abstract
Inference problems in graphical models can be represented as a constrained optimization of a free-energy function. In this paper, we treat both forms of probabilistic inference, estimating marginal probabilities of the joint distribution and finding the most probable assignment, through a unified message-passing algorithm architecture. In particular, we generalize the belief propagation (BP) sum-product and max-product algorithms and the tree-reweighted (TRW) sum- and max-product algorithms (TRBP), and introduce a new set of convergent algorithms based on a “convex-free-energy” and a linear-programming (LP) relaxation obtained as the zero-temperature limit of the convex-free-energy. The main idea of this work arises from taking a general perspective on the existing BP and TRBP algorithms and observing that they are all reductions of the basic optimization formula f + Σᵢhᵢ, where the function f is extended-valued, strictly convex but nonsmooth, and the functions hᵢ are extended-valued (not necessarily convex). We use tools from convex duality to present the “primal-dual ascent” algorithm, an extension of the Bregman successive projection scheme designed to handle optimization problems of the general form f + Σᵢhᵢ. We then map the fractional-free-energy variational principle for approximate inference onto this optimization formula and introduce the “norm-product” message-passing algorithm. Special cases of the norm-product include the sum-product and max-product (BP) algorithms, TRBP, and the NMPLP algorithm. When the fractional-free-energy is set to be convex (convex-free-energy), the norm-product is globally convergent for the estimation of marginal probabilities and for approximating the LP-relaxation. We also introduce another branch of the norm-product, which arises as the “zero-temperature” limit of the convex-free-energy and which we refer to as the “convex-max-product”. The convex-max-product is convergent (unlike max-product) and aims at solving the LP-relaxation.
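To make the sum-product special case mentioned in the abstract concrete, the following is a minimal sketch (not the authors' norm-product algorithm, and not tied to their notation) of sum-product belief propagation on a chain-structured model, where the chain is a tree and BP therefore recovers exact marginals. The function names and the chain restriction are illustrative assumptions.

```python
import numpy as np

def sum_product_chain(unaries, pairwise):
    """Sum-product BP on a chain p(x) ∝ Πᵢ φᵢ(xᵢ) Πᵢ ψᵢ(xᵢ, xᵢ₊₁).

    unaries:  list of n nonnegative (k,) arrays, the node potentials φᵢ
    pairwise: list of n-1 nonnegative (k, k) arrays; pairwise[i] couples
              nodes i and i+1 (rows index xᵢ, columns index xᵢ₊₁)
    Returns the list of exact single-node marginals (chains are trees).
    """
    n, k = len(unaries), unaries[0].shape[0]

    # Forward pass: fwd[i] is the message sent from node i-1 into node i.
    fwd = [np.ones(k) for _ in range(n)]
    for i in range(1, n):
        m = pairwise[i - 1].T @ (unaries[i - 1] * fwd[i - 1])
        fwd[i] = m / m.sum()  # normalize for numerical stability

    # Backward pass: bwd[i] is the message sent from node i+1 into node i.
    bwd = [np.ones(k) for _ in range(n)]
    for i in range(n - 2, -1, -1):
        m = pairwise[i] @ (unaries[i + 1] * bwd[i + 1])
        bwd[i] = m / m.sum()

    # Beliefs: local potential times both incoming messages, normalized.
    beliefs = []
    for i in range(n):
        b = unaries[i] * fwd[i] * bwd[i]
        beliefs.append(b / b.sum())
    return beliefs
```

On loopy graphs the same message updates are only approximate and need not converge, which is the gap the paper's convex-free-energy and norm-product machinery is designed to close.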