Stochastic Systems Group  

Ayres Fan, SSG, MIT
We are interested in the broad class of hard segmentation problems. Segmentation is the process of dividing an image into coherent regions. In some applications, such as gray matter/white matter separation in brain MR, it is a relatively easy task to accomplish. But many other applications are greatly complicated by factors such as low contrast, occlusions, missing data, poorly defined edges, and poor data models. One way to make these problems more tractable is to incorporate more information into the segmentation process. Traditional segmentation algorithms rely on edge information or area statistics with a weak prior that encourages smoothness. We instead use additional information, such as presegmented examples, to better describe whatever prior information we may have.
We work with dense curve representations (e.g., level sets). In continuous space, this means that our shape representations are infinite dimensional and exist on manifolds embedded in some Hilbert space. We then want to describe probability density functions on local regions of this manifold. We model probability on the manifold as an exponentiated distance function from training examples. Here we use L1, L2, and Wasserstein (Monge-Kantorovich) distance functions; depending on the application, one may be more appropriate than another. The Wasserstein distance is difficult to compute, so we introduce methods that improve the computation speed. Once we can describe a prior probability on the manifold, we have a MAP estimation problem that we can solve using different techniques. We show preliminary results using expectation-maximization on a mixture model and a Markov chain Monte Carlo sampling method.
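The exponentiated-distance prior described above can be sketched concretely. The following is a minimal illustration, not the authors' actual implementation: shapes are assumed to be represented as sampled signed distance functions (flattened NumPy arrays), the 1-D Wasserstein distance is computed via sorted samples (the fast special case; the general problem is the hard one the abstract refers to), and all function names and the bandwidth parameter `alpha` are hypothetical.

```python
import numpy as np

def l1_distance(phi_a, phi_b):
    # L1 distance between two sampled signed distance functions
    return np.abs(phi_a - phi_b).sum()

def l2_distance(phi_a, phi_b):
    # L2 (Hilbert-space) distance between two sampled SDFs
    return np.sqrt(((phi_a - phi_b) ** 2).sum())

def wasserstein_1d(u, v):
    # 1-D Wasserstein (Monge-Kantorovich) distance between two
    # equal-size empirical distributions; in one dimension it reduces
    # to matching sorted samples, so it is cheap to compute
    return np.abs(np.sort(u) - np.sort(v)).mean()

def shape_prior(phi, training_shapes, dist=l2_distance, alpha=1.0):
    # Unnormalized kernel-style prior on the shape manifold:
    # exponentiate the distance to the nearest training example
    d = min(dist(phi, t) for t in training_shapes)
    return np.exp(-alpha * d)
```

A shape identical to a training example gets prior value exp(0) = 1, and the prior decays exponentially as the shape moves away from all training examples on the manifold.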