Abstract:
This thesis is concerned with estimation problems in graphical models of time series. In the case of vector autoregressive (VAR) processes, the conditional independence relations between variables can be inferred by finding the zeros of the inverse spectral density matrix (ISDM). This generally involves two problems: (i) selecting the order of the VAR model and (ii) choosing the sparsity pattern of its ISDM.

We first introduce a novel information theoretic (IT) criterion for order selection, the Renormalized Maximum Likelihood (RNML). We prove that the RNML criterion is strongly consistent. We also demonstrate empirically its good performance on examples of VAR processes that have been considered in the recent literature because they possess a particular type of sparsity.

As a second contribution, we introduce novel algorithms for inferring the conditional independence graph of a VAR process. We propose a new family of convex optimization algorithms that can be used to solve this problem; in our approach, the high-sparsity assumption is not needed. We conduct experiments with simulated data, air pollution data, and stock market data to demonstrate that our algorithms are faster and more accurate than similar methods proposed in the previous literature.

Finally, we focus on latent-variable graphical models for multivariate time series. We generalize an algorithm introduced in the previous literature for finding zeros in the inverse covariance matrix, so as to identify the sparsity pattern of the ISDM of a time series containing latent components. When applied to a given time series, the algorithm produces a set of candidate models, and various IT criteria are employed to select the best among them. We conduct an empirical study in which the proposed method is compared with the state-of-the-art.