Convergence Properties of Kronecker Graphical Lasso Algorithms

Theodoros Tsiligkaridis, Alfred O. Hero III, Shuheng Zhou


This report presents a thorough convergence analysis of Kronecker graphical lasso (KGlasso) algorithms for estimating the covariance of an i.i.d. Gaussian random sample under a sparse Kronecker-product covariance model. The KGlasso model, originally called the transposable regularized covariance model by Allen {\it et al.} \cite{AllenTib10}, applies a pair of penalties, one to each Kronecker factor, to enforce sparsity in the covariance estimator. The KGlasso algorithm generalizes Glasso, introduced by Yuan and Lin \cite{YL07} and Banerjee {\it et al.} \cite{ModelSel}, to covariances having Kronecker-product form. It also generalizes the unpenalized maximum-likelihood flip-flop (FF) algorithm of Werner {\it et al.} \cite{EstCovMatKron} to the estimation of sparse Kronecker factors. We establish high-dimensional rates of convergence to the true covariance as both the number of samples and the number of variables go to infinity. Our results show that KGlasso has significantly faster asymptotic convergence than Glasso and FF. Simulations validate the results of our analysis: for example, for a sparse covariance matrix equal to the Kronecker product of two matrices, the root-mean-squared error of the inverse covariance estimate using FF is 3.5 times larger than that obtained using KGlasso.
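To make the setting concrete, the unpenalized ML flip-flop (FF) baseline that KGlasso builds on can be sketched in a few lines. This is a minimal illustration only, written in Python rather than the repository's MATLAB, and it is not the authors' code: the ground-truth factors A0, B0, the dimensions, and all variable names are our own. Under the matrix-normal convention, each sample reshapes into a q x p matrix X_i with Cov(vec(X_i)) = A ⊗ B, and the FF iteration alternates closed-form updates of the two factors.

```python
import numpy as np

def flip_flop(Xs, n_iter=10):
    """Unpenalized ML flip-flop for Cov(vec(X)) = A kron B (sketch).

    Xs has shape (n, q, p); each sample X_i is a q x p matrix whose
    column-stacked vectorization has covariance A kron B.
    """
    n, q, p = Xs.shape
    A, B = np.eye(p), np.eye(q)
    for _ in range(n_iter):
        # Closed-form update of A given B, then of B given A.
        Binv = np.linalg.inv(B)
        A = sum(Xi.T @ Binv @ Xi for Xi in Xs) / (n * q)
        Ainv = np.linalg.inv(A)
        B = sum(Xi @ Ainv @ Xi.T for Xi in Xs) / (n * p)
    return A, B

# Hypothetical ground truth (names A0, B0 and the dimensions are ours).
rng = np.random.default_rng(0)
p, q, n = 3, 4, 2000
A0 = np.array([[2.0, 0.5, 0.0],
               [0.5, 1.5, 0.3],
               [0.0, 0.3, 1.0]])
B0 = np.eye(q) + 0.4 * (np.eye(q, k=1) + np.eye(q, k=-1))
Sigma = np.kron(A0, B0)                      # (p*q) x (p*q) true covariance

# Draw n i.i.d. Gaussian vectors with covariance Sigma, then reshape each
# into a q x p matrix so that column-stacking recovers the original vector.
Z = (np.linalg.cholesky(Sigma) @ rng.standard_normal((p * q, n))).T
X = Z.reshape(n, p, q).transpose(0, 2, 1)    # shape (n, q, p)

A_hat, B_hat = flip_flop(X)
# The factors are identifiable only up to the rescaling (c*A, B/c), so we
# compare the Kronecker products rather than the factors themselves.
rel_err = (np.linalg.norm(np.kron(A_hat, B_hat) - Sigma)
           / np.linalg.norm(Sigma))
```

KGlasso replaces each closed-form update above with an l1-penalized (Glasso-type) step on the corresponding factor's inverse, which is what yields the sparsity and the faster rates analyzed in the report.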



The folder “TSP final version - KGlasso” contains the PDF/TeX source and figures for the paper. The folder “KGlasso_code” contains MATLAB code for running a large-sample KGlasso simulation (see driver_fixed_FF_KGL.m; Figs. 3 and 4 in the paper). The folder “KGlasso_rate_sims” contains MATLAB code for running simulations with increasing dimension and sample size (see simFamCurves.m and simFamCurves1.m; Figs. 5 and 6 in the paper).
