Stochastic Systems Group

SSG Seminar Abstract


Variational inference for Dirichlet process mixtures

David Blei
Princeton University


Dirichlet process (DP) mixture models are a cornerstone of nonparametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of nonparametric Bayesian methods to a variety of practical data analysis problems (Escobar and West, 1995). However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One class of alternatives is provided by variational methods (Jordan et al., 1999), a family of deterministic algorithms that convert inference problems into optimization problems. Thus far, variational methods have mainly been explored in the parametric setting, in particular within the formalism of the exponential family. In this talk, I will present a variational inference algorithm for DP mixtures. I will also present simulations that compare the algorithm to Gibbs sampling for DP mixtures of Gaussians, as well as an application to a large-scale image analysis problem.

This is joint work with Michael Jordan.
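
For background, the following sketch (not part of the abstract) outlines the kind of model and objective such an algorithm works with, using the standard stick-breaking representation of the DP mixture; the truncation level T and the mean-field factorization are assumptions based on common treatments of variational DP mixtures, not details taken from the talk itself.

  V_t \sim \mathrm{Beta}(1,\alpha), \qquad
  \pi_t(\mathbf{v}) = v_t \prod_{s<t} (1 - v_s), \qquad
  \eta_t^* \sim G_0,

  Z_n \mid \mathbf{v} \sim \mathrm{Mult}(\pi(\mathbf{v})), \qquad
  X_n \mid z_n \sim p(x \mid \eta_{z_n}^*).

A mean-field variational approximation truncated at level T posits

  q(\mathbf{v}, \boldsymbol{\eta}^*, \mathbf{z})
    = \prod_{t=1}^{T-1} q_{\gamma_t}(v_t)
      \prod_{t=1}^{T} q_{\tau_t}(\eta_t^*)
      \prod_{n=1}^{N} q_{\phi_n}(z_n)

and maximizes the evidence lower bound

  \log p(\mathbf{x} \mid \alpha, G_0)
    \ge \mathbb{E}_q[\log p(\mathbf{v}, \boldsymbol{\eta}^*, \mathbf{z}, \mathbf{x})]
      - \mathbb{E}_q[\log q(\mathbf{v}, \boldsymbol{\eta}^*, \mathbf{z})],

so that posterior inference becomes a deterministic optimization over the free parameters (\gamma, \tau, \phi), in contrast to the stochastic updates of a Gibbs sampler.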


