|Stochastic Systems Group|
Linear state space models provide a powerful and practical framework for modeling multivariate time series. Recently, a new class of algorithms for identifying state space models from data, known as subspace methods, has been developed. We argue that subspace methods are best seen as an application of classical stochastic realization theory to the system identification problem. Stochastic realization theory deals with the design of models whose output statistics exactly (or approximately) match a given probability distribution. Conceptually, subspace methods work by first estimating the sample covariance and then constructing a model that approximately realizes the corresponding empirical Gaussian distribution. We will begin by emphasizing the role of the state as an information interface between the past and the future. This leads naturally to the heart of classical realization theory: a factorization of the Hankel matrix relating the past and future processes into its corresponding reachability and observability matrices. We will characterize the set of realizations of minimal state dimension, including its connections to the innovations representation, or Kalman filter. In practical situations where data is finite and computational resources are limited, it is necessary to consider approximate stochastic realizations. We will discuss two criteria, known as canonical correlations and predictive efficiency, for choosing the "most informative" components of the state space. We will finish with a high-level description of how the tools of stochastic realization may be naturally and efficiently adapted to the system identification problem. In particular, we will discuss how subspace methods avoid the difficulties associated with classical prediction-error techniques, with the caveat that they may not produce a viable model for arbitrary input data.
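As a rough illustration of the covariance-based route sketched above, the following is a minimal numerical sketch (not the speaker's algorithm): estimate sample output covariances, assemble them into a Hankel matrix, factor it by a truncated SVD into observability and reachability factors, and read off an approximate realization. The toy system, horizon, and all variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state, 1-output system used only to generate data;
# the identification step below sees only the output samples y.
A_true = np.array([[0.8, 0.3], [0.0, 0.5]])
C_true = np.array([[1.0, 1.0]])
T = 100_000
x = np.zeros(2)
y = np.empty(T)
for t in range(T):
    y[t] = C_true @ x + 0.1 * rng.standard_normal()
    x = A_true @ x + rng.standard_normal(2)

# Step 1: sample covariances Lambda_k = E[y_{t+k} y_t].
p = 5  # past/future horizon (an illustrative choice)
lam = [np.mean(y[k:] * y[:T - k]) for k in range(2 * p + 1)]

# Step 2: Hankel matrix relating past and future, H[i, j] = Lambda_{i+j+1}.
H = np.array([[lam[i + j + 1] for j in range(p)] for i in range(p)])

# Step 3: truncated SVD gives the approximate rank-n factorization of H
# into observability (Obs) and reachability factors.
n = 2
U, s, Vt = np.linalg.svd(H)
Obs = U[:, :n] * np.sqrt(s[:n])

# Step 4: C is the first block row of Obs; A follows from the
# shift-invariance Obs[1:] = Obs[:-1] @ A (least-squares solution).
C_hat = Obs[:1, :]
A_hat = np.linalg.pinv(Obs[:-1, :]) @ Obs[1:, :]

# The realization is recovered only up to similarity, so we compare
# eigenvalues of A_hat with those of A_true (0.5 and 0.8).
print(sorted(np.linalg.eigvals(A_hat).real))
```

Because any minimal realization is determined only up to a change of state coordinates, the estimated `A_hat` is compared to the true dynamics through its eigenvalues rather than entry by entry.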