
Shrinkage estimation of high dimensional covariance matrices


We address covariance estimation under mean-squared loss in the Gaussian setting. Specifically, we consider shrinkage methods suitable for high dimensional problems with a small number of samples (large p, small n). First, we improve on the Ledoit-Wolf (LW) method by conditioning on a sufficient statistic via the Rao-Blackwell theorem, obtaining a new estimator, RBLW, whose mean-squared error dominates that of LW under the Gaussian model. Second, to further reduce the estimation error, we propose an iterative approach that approximates the clairvoyant shrinkage estimator. Convergence of this iterative method is proven, and a closed-form expression for the limit, called the OAS estimator, is determined. Both of the proposed estimators have simple expressions and are easy to compute. Although the two methods are developed from different approaches, their structures are identical up to specific constants. The RBLW estimator provably dominates the LW method, and numerical simulations demonstrate that the OAS estimator performs even better, especially when n is much less than p.


Y. Chen, A. Wiesel, and A. O. Hero, “Shrinkage estimation of high dimensional covariance matrices,” in Proc. IEEE Int. Conf. on Acoustics, Speech, and Signal Processing (ICASSP), Taiwan, Mar. 2009. (.pdf)

Matlab Code

Download the Matlab sources (.m) and run demo.m to reproduce the figures below.


  • Figure 1. MSE of different estimators. AR(1) process, p = 100, n varies from 5 to 120, r = 0.5.
  • Figure 2. Comparison of shrinkage coefficients of different estimators. AR(1) process, p = 100, n varies from 5 to 120, r = 0.5.
