Statistics and Its Interface

Volume 16 (2023)

Number 3

An iterative algorithm with adaptive weights and sparse Laplacian shrinkage for regression problems

Pages: 433 – 443

DOI: https://dx.doi.org/10.4310/22-SII732

Authors

Xingyu Chen (School of Statistics and Mathematics, Central University of Finance and Economics, Beijing, China)

Yuehan Yang (School of Statistics and Mathematics, Central University of Finance and Economics, Beijing, China)

Abstract

This paper considers the regression problem with correlation structures among covariates. We propose an iterative algorithm, named Adaptive Sparse Laplacian Shrinkage (AdaSLS), which is based on graph-constrained regularization. In each iteration, an adaptive weight is fitted within the feature space obtained from the previous step. Under suitable regularity conditions, AdaSLS recovers the correct feature set and achieves accurate estimation with high probability, and its bias decays at an exponential rate. Numerical comparisons show that AdaSLS improves the accuracy of both variable selection and estimation. We also apply the proposed algorithm to a gene microarray dataset and a chimeric protein dataset, obtaining meaningful results.
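The abstract describes the algorithm only at a high level. As a rough illustration of the general idea (a weighted lasso penalty combined with a Laplacian quadratic penalty, with weights refit from the previous iterate), the Python sketch below may help; the penalty form, the weight rule 1/(|beta_j| + eps), and the proximal-gradient solver are assumptions made for illustration and are not the authors' exact procedure.

```python
# Minimal sketch of an adaptive sparse Laplacian shrinkage estimator in the
# spirit of AdaSLS (not the authors' exact algorithm). It solves
#   min_beta (1/2n)||y - X beta||^2 + lam1 * sum_j w_j |beta_j|
#                                   + (lam2/2) * beta' L beta
# by proximal gradient descent, then refits with adaptive weights
# w_j = 1 / (|beta_j| + eps) computed from the previous iteration.
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sls_prox_grad(X, y, L, weights, lam1=0.1, lam2=0.1, n_steps=500):
    """Weighted lasso + Laplacian penalty, solved by proximal gradient."""
    n, p = X.shape
    beta = np.zeros(p)
    # Step size from a Lipschitz bound on the gradient of the smooth part.
    lip = np.linalg.norm(X, 2) ** 2 / n + lam2 * np.linalg.norm(L, 2)
    step = 1.0 / lip
    for _ in range(n_steps):
        grad = -X.T @ (y - X @ beta) / n + lam2 * (L @ beta)
        beta = soft_threshold(beta - step * grad, step * lam1 * weights)
    return beta

def ada_sls(X, y, L, lam1=0.1, lam2=0.1, n_iter=5, eps=1e-4):
    """Iterate: fit, then refit with weights taken from the previous step."""
    p = X.shape[1]
    weights = np.ones(p)   # first pass: unweighted (lasso-type) fit
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta = sls_prox_grad(X, y, L, weights, lam1, lam2)
        weights = 1.0 / (np.abs(beta) + eps)   # assumed adaptive-weight rule
    return beta
```

Here L is the graph Laplacian encoding the correlation structure among covariates; the quadratic term beta' L beta shrinks coefficients of connected features toward each other, while the adaptive weights reduce the bias of the plain lasso penalty across iterations.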

Keywords

high-dimensional data, correlated effects, Laplacian matrix, adaptive weight, iterative algorithm


This work was supported by the National Natural Science Foundation of China (Grant No. 12001557), the Youth Talent Development Support Program (QYP202104), the Emerging Interdisciplinary Project, and the Disciplinary Funding of Central University of Finance and Economics.

Received 8 August 2021

Accepted 24 March 2022

Published 14 April 2023