Statistics and Its Interface

Volume 14 (2021)

Number 3

High-dimensional correlation matrix estimation for Gaussian data: a Bayesian perspective

Pages: 351 – 358

DOI: https://dx.doi.org/10.4310/20-SII655

Authors

Chaojie Wang (Faculty of Science, Jiangsu University, Zhenjiang, China)

Xiaodan Fan (Department of Statistics, Chinese University of Hong Kong)

Abstract

Gaussian covariance or precision matrix estimation is a classical problem in high-dimensional data analysis. For precision matrix estimation, the graphical lasso provides an efficient approach by optimizing the log-likelihood function with an $L_1$-norm penalty. Inspired by the success of the graphical lasso, researchers have pursued an analogous penalized-likelihood approach for covariance matrix estimation. However, that approach suffers from the difficulty of non-convex optimization and from a degeneration problem when $p > n$, caused by the singularity of the sample covariance matrix. In this paper, we fix the degeneration problem by adding an extra constraint on the diagonal elements. From a Bayesian perspective, we develop a grid-point gradient descent (GPGD) algorithm, combined with a block Gibbs sampler, to sample from the posterior distribution of the correlation matrix. The algorithm provides an effective way to draw samples under the positive-definiteness constraint, and it can explore the whole feasible region to reach the mode of the posterior distribution. Simulation studies and a real application demonstrate that our method is competitive with existing methods in various settings.
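For context, the graphical lasso referred to above estimates the precision matrix $\Theta$ by maximizing the $L_1$-penalized Gaussian log-likelihood in its standard form,
$$\hat{\Theta} = \arg\max_{\Theta \succ 0} \left\{ \log\det\Theta - \operatorname{tr}(S\Theta) - \lambda \|\Theta\|_1 \right\},$$
where $S$ is the sample covariance matrix and $\lambda > 0$ controls the sparsity of the estimate. This objective is convex in $\Theta$. A commonly used covariance analogue instead penalizes the likelihood written in terms of $\Sigma$, e.g. $-\log\det\Sigma - \operatorname{tr}(S\Sigma^{-1}) - \lambda\|\Sigma\|_1$, which is non-convex in $\Sigma$; this is the optimization difficulty, together with the singularity of $S$ when $p > n$, that motivates the constrained Bayesian formulation studied in the paper. These displays are standard background sketches, not the paper's own derivation.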

Keywords

correlation matrix estimation, positive-definiteness, non-convex optimization, Bayesian analysis


This research is partially supported by two grants from the Research Grants Council of the Hong Kong SAR, China (CUHK14173817, CUHK14303819), one CUHK direct grant (4053357) and one UJS direct grant (5501190012).

Received 21 April 2020

Accepted 29 November 2020

Published 9 February 2021