Stochastic Systems Group  

Tightening Convex Relaxations for Probabilistic Inference
David Sontag
CSAIL, MIT
Linear Programming (LP) relaxations have become powerful tools for finding the most probable (MAP) assignment in graphical models. These relaxations can be solved efficiently using message-passing algorithms such as belief propagation and, when the relaxation is tight, provably find the MAP assignment. However, the standard LP relaxation is not tight enough for many real-world problems, raising the question of how to tighten it. We propose two algorithms that tighten the relaxation iteratively in a problem-specific way, at each stage using the current solution to guide which constraints are added. We first give a cutting-plane algorithm for the primal LP, and show how to efficiently find valid constraints that are violated by the current solution. Our second algorithm performs constraint selection monotonically in the dual LP, choosing constraints that decrease an upper bound on the MAP objective. Our algorithms find the MAP assignment in protein side-chain placement, protein design, and stereo problems, in cases where the standard LP relaxation fails.
Joint work with Talya Meltzer, Amir Globerson, Yair Weiss, and Tommi Jaakkola.
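The tightening idea in the abstract can be illustrated on a small example. The sketch below (my own toy construction, not the speakers' implementation, which uses message-passing rather than a generic LP solver) builds the standard pairwise LP relaxation for a "frustrated" cycle of three binary variables where every edge rewards disagreement. The pairwise relaxation is loose (value 3 with fractional marginals, versus a true MAP value of 2); adding one cluster of constraints over the violated cycle, as in a single cutting-plane step, closes the gap. Variable names and indexing are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Toy MAP problem: a "frustrated" cycle of three binary variables in which
# every edge rewards disagreement.  Any integral assignment can make at most
# two of the three edges disagree (MAP value 2), yet the standard pairwise
# LP relaxation achieves 3 with fractional marginals mu_i = (1/2, 1/2).
edges = [(0, 1), (1, 2), (0, 2)]

# Variable layout: 6 node entries mu_i(a), then 4 entries per edge mu_e(a,b).
def node_idx(i, a):
    return 2 * i + a

def edge_idx(e, a, b):
    return 6 + 4 * e + 2 * a + b

n_pair = 6 + 4 * len(edges)

# Objective: maximize total disagreement (linprog minimizes, so negate).
c = np.zeros(n_pair)
for e in range(len(edges)):
    for a in range(2):
        for b in range(2):
            if a != b:
                c[edge_idx(e, a, b)] = -1.0

rows, rhs = [], []
def add_row(entries, b, width=n_pair):
    r = np.zeros(width)
    for j, v in entries:
        r[j] = v
    rows.append(r)
    rhs.append(b)

# Node normalization: sum_a mu_i(a) = 1.
for i in range(3):
    add_row([(node_idx(i, 0), 1), (node_idx(i, 1), 1)], 1.0)

# Local (edge-to-node) marginalization consistency.
for e, (i, j) in enumerate(edges):
    for a in range(2):
        add_row([(edge_idx(e, a, 0), 1), (edge_idx(e, a, 1), 1),
                 (node_idx(i, a), -1)], 0.0)
        add_row([(edge_idx(e, 0, a), 1), (edge_idx(e, 1, a), 1),
                 (node_idx(j, a), -1)], 0.0)

res = linprog(c, A_eq=np.array(rows), b_eq=np.array(rhs),
              bounds=[(0, 1)] * n_pair)
print("pairwise LP value:", -res.fun)   # 3.0 -- loose, fractional solution

# Tightening step: the relaxed solution violates cycle consistency, so add
# a cluster over the whole triangle -- 8 new variables tau(x0, x1, x2)
# constrained to marginalize down to every edge marginal.
n_tot = n_pair + 8
def tri_idx(a, b, c_):
    return n_pair + 4 * a + 2 * b + c_

rows = [np.concatenate([r, np.zeros(8)]) for r in rows]
for a in range(2):
    for b in range(2):
        # sum_{x2} tau(a,b,x2) = mu_{01}(a,b), and likewise for (1,2), (0,2)
        add_row([(tri_idx(a, b, 0), 1), (tri_idx(a, b, 1), 1),
                 (edge_idx(0, a, b), -1)], 0.0, width=n_tot)
        add_row([(tri_idx(0, a, b), 1), (tri_idx(1, a, b), 1),
                 (edge_idx(1, a, b), -1)], 0.0, width=n_tot)
        add_row([(tri_idx(a, 0, b), 1), (tri_idx(a, 1, b), 1),
                 (edge_idx(2, a, b), -1)], 0.0, width=n_tot)

res2 = linprog(np.concatenate([c, np.zeros(8)]),
               A_eq=np.array(rows), b_eq=np.array(rhs),
               bounds=[(0, 1)] * n_tot)
print("tightened LP value:", -res2.fun)  # 2.0 -- now matches the true MAP
```

Because the added cluster covers the entire (three-variable) cycle, the tightened relaxation is exact on this instance; on larger models one would instead search for a violated cluster among candidate cycles or triplets at each iteration.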