Stochastic Systems Group

SSG Seminar Abstract


Approximate Inference using Conditional Entropy Decompositions

Amir Globerson
CSAIL, MIT


Graphical models are a powerful tool for representing multivariate distributions, and have been used with considerable success in domains such as channel coding and image processing. Although they yield compact representations of distributions, it is generally NP-hard to infer even simple model properties, such as the marginals over single variables or the MAP assignment. This difficulty has prompted extensive research into approximate inference algorithms.
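
For concreteness (this notation is ours, not taken from the abstract), a pairwise graphical model over discrete variables x = (x_1, ..., x_n) with edge set E can be written as

    p(x) = \frac{1}{Z} \prod_{(i,j) \in E} \psi_{ij}(x_i, x_j) \prod_i \psi_i(x_i),
    \qquad
    Z = \sum_x \prod_{(i,j) \in E} \psi_{ij}(x_i, x_j) \prod_i \psi_i(x_i).

The two inference tasks mentioned above are the single-variable marginals p(x_i) = \sum_{x \setminus x_i} p(x) and the MAP assignment x^* = \arg\max_x p(x); both are NP-hard to compute exactly in general graphs.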

Here we introduce a novel approximate inference method that combines concepts from both variable elimination and variational approaches. The method uses a truncated entropy chain rule to obtain an upper bound on the entropy of a graphical model. The structure of the bound is determined by a permutation, or elimination order, of the model variables. The entropy bound can then be used to construct a free energy function, whose minimization yields an upper bound on the model partition function, and an approximation to the model marginals.
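
To sketch the bound (the permutation \pi and the subsets S_i below are our notation): the exact entropy chain rule under any ordering \pi, combined with the fact that conditioning never increases entropy, gives

    H(x_1, \dots, x_n)
    \;=\; \sum_{i=1}^n H\bigl(x_{\pi(i)} \mid x_{\pi(1)}, \dots, x_{\pi(i-1)}\bigr)
    \;\le\; \sum_{i=1}^n H\bigl(x_{\pi(i)} \mid x_{S_i}\bigr),
    \qquad S_i \subseteq \{\pi(1), \dots, \pi(i-1)\},

where truncating each conditioning set to a small S_i keeps every term tractable. Substituting this upper bound on the entropy into the exact variational identity \log Z = \max_{\mu \in \mathcal{M}} \langle \theta, \mu \rangle + H(\mu) (with \mathcal{M} the marginal polytope), and relaxing \mathcal{M} to constraints on only the marginals that appear in the decomposition, yields a tractable free energy whose optimum upper-bounds \log Z; the precise relaxation used in the talk is not reproduced here.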

The resulting optimization problem is convex, and is in fact a dual of a geometric program. We show that in some cases this dual can be further simplified to yield an unconstrained convex minimization problem. In these cases, we also describe a provably convergent message-passing algorithm.
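
As background (a standard definition, not the specific dual derived in the talk): a geometric program minimizes a posynomial f_0(t) = \sum_\ell c_{0\ell} \prod_j t_j^{a_{0\ell j}}, with c_{0\ell} > 0 and t_j > 0, subject to posynomial constraints f_k(t) \le 1. Under the change of variables y_j = \log t_j, and after taking logarithms, every posynomial becomes a log-sum-exp function, so the problem is convex; since the convex conjugate of log-sum-exp is a negative entropy restricted to the probability simplex, entropy terms naturally appear in geometric program duals, which is consistent with an entropy-based free energy arising as such a dual.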

We evaluate the method on a 2D Ising model over a wide range of parameters, and show that it compares favorably with previous methods in terms of both the tightness of the partition function bound and the accuracy of the marginals.
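
For small grids, the exact partition function and marginals can be computed by brute force, which is one way to check any upper bound or approximate marginals on toy instances. Below is a minimal sketch in Python; the grid size, couplings, and fields are arbitrary illustrations, not the parameter ranges used in the evaluation.

    import itertools
    import math
    import random

    random.seed(0)
    rows, cols = 3, 3                 # keep the grid tiny: 2**(rows*cols) states

    # Grid nodes, horizontal/vertical edges, random couplings and local fields.
    nodes = [(r, c) for r in range(rows) for c in range(cols)]
    edges = [((r, c), (r, c + 1)) for r in range(rows) for c in range(cols - 1)] + \
            [((r, c), (r + 1, c)) for r in range(rows - 1) for c in range(cols)]
    J = {e: random.uniform(-1.0, 1.0) for e in edges}    # pairwise couplings
    h = {v: random.uniform(-0.5, 0.5) for v in nodes}    # local fields

    def energy(spins):
        """Ising energy E(s) = -sum_ij J_ij s_i s_j - sum_i h_i s_i."""
        pair = sum(J[(u, v)] * spins[u] * spins[v] for (u, v) in edges)
        field = sum(h[v] * spins[v] for v in nodes)
        return -(pair + field)

    # Enumerate all spin configurations and record the log-weights -E(s).
    states, log_w = [], []
    for assignment in itertools.product([-1, +1], repeat=len(nodes)):
        spins = dict(zip(nodes, assignment))
        states.append(assignment)
        log_w.append(-energy(spins))

    # Exact log-partition function via a numerically stable log-sum-exp.
    m = max(log_w)
    logZ = m + math.log(sum(math.exp(w - m) for w in log_w))

    # Exact single-node marginals p(s_v = +1).
    probs = [math.exp(w - logZ) for w in log_w]
    marg = {v: sum(p for s, p in zip(states, probs) if s[k] == +1)
            for k, v in enumerate(nodes)}

    print("exact log Z:", logZ)
    print("p(s_(0,0) = +1):", marg[(0, 0)])

Any upper bound on log Z produced by an approximate method can then be compared against logZ, and approximate beliefs against marg, on instances small enough for exact enumeration.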

Based on joint work with Tommi Jaakkola.


