## Jason K. Johnson

PhD, EECS Dept., MIT. Stochastic Systems Group (SSG), Laboratory for Information and Decision Systems (LIDS).

I completed the PhD program in 2008 and am now a director-funded postdoctoral fellow working with Michael Chertkov at Los Alamos National Laboratory, Center for Nonlinear Studies and Theoretical Division T-4. Please see my current CNLS webpage for recent publications.

I attended Appalachian State University for two years before transferring to MIT, where I earned the S.B. in Physics in 1995. Over the next five years, I was a member of the technical staff at Alphatech Inc., where I helped develop algorithms for multi-resolution signal and image processing, data fusion, and multi-target tracking. In 2000, I entered the EECS graduate program at MIT under the direction of Alan Willsky, earning the S.M. in 2003 and the PhD in 2008.

My research has focused on the use of information theory and convex optimization to provide principled, tractable approximation methods for solving large-scale inference and estimation problems involving graphical models, also known as Markov random fields (MRFs). In particular, Gaussian MRFs (commonly used in image processing) have played a central role in these investigations.
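To make the setting concrete, here is a minimal sketch (not from the original page) of exact inference in a small Gaussian MRF expressed in information form, the parameterization typically used in this line of work; the specific matrix values are illustrative only:

```python
import numpy as np

# A Gaussian MRF on a 3-node chain, in information form:
# p(x) ∝ exp(-x'Jx/2 + h'x), where the sparsity of J mirrors
# the graph (J[i, j] is nonzero only for edges {i, j}).
J = np.array([[1.0, 0.4, 0.0],
              [0.4, 1.0, 0.4],
              [0.0, 0.4, 1.0]])
h = np.array([1.0, 0.0, -1.0])

# Exact inference: the means are J^{-1} h and the marginal
# variances are the diagonal of J^{-1}.  This dense solve is
# tractable here, but its cost on large graphs is exactly what
# the approximate methods below are designed to avoid.
Sigma = np.linalg.inv(J)
means = Sigma @ h
variances = np.diag(Sigma)
```

The point of the information form is that the graph structure is read directly off the sparsity pattern of J, which the approximation methods exploit.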

Here are summaries of several novel methods that I introduced:

- Recursive Cavity Modeling
- Maximum Entropy Relaxation
- Lagrangian Relaxation for Intractable Graphical Models
- Walk-Sum Analysis in Gaussian Graphical Models
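The walk-sum idea behind the last item can be illustrated with a small sketch (illustrative values, not from the original page): writing J = I - R, a model is walk-summable when the spectral radius of |R| is below 1, in which case the covariance is the convergent series of powers of R, with entry (i, j) summing the weights of all walks from node i to node j:

```python
import numpy as np

# Partial-correlation (edge-weight) matrix R for J = I - R on a
# 3-node chain.  Walk-summability: spectral radius of |R| < 1.
R = np.array([[0.0, 0.3, 0.0],
              [0.3, 0.0, 0.3],
              [0.0, 0.3, 0.0]])
assert np.max(np.abs(np.linalg.eigvals(np.abs(R)))) < 1

# Truncated walk-sum: accumulate walks of length 0, 1, ..., K.
# Each power R^k contributes the weights of all length-k walks.
Sigma_ws = np.zeros_like(R)
term = np.eye(3)
for _ in range(200):
    Sigma_ws += term
    term = term @ R

# The series converges to the exact covariance (I - R)^{-1}.
Sigma_exact = np.linalg.inv(np.eye(3) - R)
```

Walk-summability is the condition under which Gaussian belief propagation and embedded-subgraph iterations can be analyzed as computing subsets of these walks.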

TA for 6.867 Introduction to Machine Learning, Fall 2003.

- Convex relaxation methods for graphical models: Lagrangian and maximum entropy approaches, PhD thesis, MIT, August, 2008.
- Estimation of GMRFs by recursive cavity modeling, Master's thesis, MIT, March, 2003.

- Johnson, Willsky. A recursive model-reduction method for estimation in Gaussian Markov random fields, IEEE Transactions on Image Processing, January 2008.
- Recursive cavity modeling for estimation in Gaussian MRFs. SSG Seminar, October 2002.

- Johnson, Chandrasekaran, Willsky. Learning Markov structure by maximum entropy relaxation, 11th Inter. Conf. on AI and Stat. (AISTATS '07), San Juan, Puerto Rico, March 2007. [poster].
- Chandrasekaran, Johnson, Willsky. Maximum entropy relaxation for graphical model selection given inconsistent statistics, IEEE Statistical Signal Processing Workshop, August 2007.
- Learning graphical models by maximum entropy relaxation. SSG Seminar, October 2006.

- Johnson, Malioutov, Willsky. Lagrangian relaxation for MAP estimation in graphical models. 45th Annual Allerton Conference on Communication, Control and Computing. September 2007. [slides].
- Lagrangian relaxation methods for intractable graphical models. SSG Seminar, September 2005.

- Equivalence of Entropy Regularization and Relative-Entropy Proximal Method. Technical Note, May 2008.

- Johnson. Walk-summable Gauss-Markov random fields. Technical Report, February 2002. (Corrected, November 2005).
- Johnson, Malioutov, Willsky. Walk-sum interpretation and analysis of Gaussian belief propagation, in *Advances in Neural Information Processing Systems*, vol. 18, pp. 579-586, 2006. This paper was selected for a spotlight at the conference.
- Malioutov, Johnson, Willsky. Walk-sums and belief propagation in Gaussian graphical models, Journal of Machine Learning Research, vol. 7, pp. 2031-2064, October 2006.
- Chandrasekaran, Johnson, Willsky. Estimation in Gaussian graphical models using tractable sub-graphs: a walk-sum analysis, To appear, IEEE Transactions on Signal Processing.
- Chandrasekaran, Johnson, Willsky. Adaptive embedded subgraph algorithms using walk-sum analysis, in *Advances in Neural Information Processing Systems*, December 2007.

- Malioutov, Johnson, Willsky. Low-rank variance estimation in large-scale GMRF models, ICASSP 2006. We received a *student paper award*.
- Malioutov, Johnson, Willsky. GMRF variance approximation using spliced wavelet bases, ICASSP 2007.
- Johnson, Chaney. Recursive composition inference for force aggregation, Proc. of the 2nd Inter. Conf. on Information Fusion, vol. 2, July 1999. We were honored to receive Alphatech's *Joseph G. Wohl Memorial Achievement Award* for this paper.

- Fisher Information in Gaussian Graphical Models. Technical note, MIT. September, 2006.
- On Möbius Transforms and Boltzmann Machines. Technical note, MIT. August, 2006.
- Johnson, Kreidl. Exponential family graphical models: inference, learning and convexity. Project for graduate seminar on learning and system identification, MIT. May, 2005.
- Min-Max Kullback-Leibler model selection. Project for convex analysis course, MIT. May, 2002.
- Fan, Johnson, Malioutov. Nonlinear optimization in exponential family graphical models [report, talk]. Project for nonlinear optimization course, MIT. May, 2002.
- A simulation technique for GMRFs. Technical note, MIT. April, 2001.

Last updated: December 13, 2010.
