Stochastic Systems Group

SSG Seminar Abstract


Learning Graphical Models by Maximum Entropy Relaxation

Jason Johnson
SSG, MIT


We propose a new approach for learning a thin graphical model that approximates a specified multivariate probability distribution (e.g., the empirical distribution of sample data). Importantly, the selection of a sparse graph structure arises naturally through the solution of a convex optimization problem, which distinguishes our method from standard combinatorial approaches. Specifically, we solve a maximum entropy relaxation (MER) within an exponential family: entropy is maximized subject to constraints that the marginal distributions on small subsets of variables be close, in relative entropy, to the prescribed marginals. We also present a primal-dual interior point solution method that is scalable and tractable provided the level of relaxation is sufficient to yield a thin graph. A key element of this method is the use of efficient representations of the Fisher information matrix for models defined on chordal graphs. Our development focuses on two important exponential families: Gaussian models and Boltzmann machines. We investigate the merits of the approach by recovering the graph structure of some simple models from sample data.
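As a rough sketch of the optimization described above (the subsets E and tolerances delta_E are illustrative notation, not taken verbatim from the talk), the MER problem can be written as an entropy maximization with relative-entropy constraints on small-subset marginals:

    % Illustrative MER formulation (assumed notation):
    %   p ranges over the chosen exponential family F; \hat{p}_E is the prescribed
    %   (e.g., empirical) marginal on a small subset E of variables; \delta_E sets
    %   the level of relaxation.
    \begin{aligned}
      \max_{p \in \mathcal{F}} \quad & H(p) \\
      \text{subject to} \quad & D\!\left( p_E \,\middle\|\, \hat{p}_E \right) \le \delta_E
        \quad \text{for each small subset } E,
    \end{aligned}

where H(p) is the entropy of p, p_E its marginal on the variables in E, and D(. || .) the relative entropy (Kullback-Leibler divergence). Larger tolerances delta_E relax the marginal-matching constraints further, which is what allows a thinner graph structure to emerge from the convex program.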

Joint work with Venkat Chandrasekaran and Alan Willsky.


