Statistics and Its Interface
Volume 17 (2024)
Number 3
Debiased distributed quantile regression in high dimensions
Pages: 337 – 347
DOI: https://dx.doi.org/10.4310/22-SII759
Abstract
This paper concerns debiased distributed estimation for the high-dimensional linear model with an arbitrary noise distribution. Quantile regression (QR) is adopted to safeguard against potentially heavy-tailed noise. To tackle the computational challenges posed by the non-smooth QR loss, we cast the QR loss into a least-squares loss by constructing new pseudo responses. We further equip the new least-squares loss with the $\ell_1$ penalty to accomplish coefficient estimation and variable selection simultaneously. To eliminate the bias introduced by the $\ell_1$ penalty, we correct the bias of the nonzero coefficient estimates on each local machine and aggregate all the local debiased estimators by averaging. Our distributed algorithm is guaranteed to converge in a finite number of iterations. Theoretically, we show that the resulting estimator consistently recovers the sparsity pattern and achieves a near-oracle convergence rate. We conduct extensive numerical studies to demonstrate the competitive finite-sample performance of our method.
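The pipeline outlined in the abstract (smooth the QR loss via pseudo responses, solve an $\ell_1$-penalized least-squares problem on each machine, debias the local estimates, and average) can be sketched as follows. This is an illustrative stand-in, not the paper's exact algorithm: the pseudo-response formula, the kernel bandwidth `h`, the penalty level `lam`, and the post-selection OLS refit used here as a simple bias-correction step are all assumptions made for the sketch.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimize (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]   # partial residual for coord j
            rho = X[:, j] @ resid / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def local_debiased_qr(X, y, tau=0.5, lam=0.1, rounds=3, h=0.5):
    """One machine: pseudo-response QR lasso, then OLS refit on the support
    as a simple (assumed) form of bias correction."""
    n, p = X.shape
    beta = lasso_cd(X, y, lam)                 # least-squares lasso warm start
    for _ in range(rounds):
        res = y - X @ beta
        # kernel estimate of the residual density at 0 (bandwidth h is assumed)
        f_hat = max(np.mean(np.abs(res) <= h) / (2 * h), 1e-3)
        # pseudo responses turning the check loss into a least-squares target
        y_tilde = X @ beta + (tau - (res <= 0)) / f_hat
        beta = lasso_cd(X, y_tilde, lam)
    S = np.flatnonzero(np.abs(beta) > 1e-8)    # estimated support
    beta_db = np.zeros(p)
    if S.size:                                  # refit on the support to reduce
        beta_db[S] = np.linalg.lstsq(X[:, S], y_tilde, rcond=None)[0]  # lasso bias
    return beta_db

# Simulated distributed setting: K machines, heavy-tailed (Student-t) noise.
rng = np.random.default_rng(0)
p, s, K, n_k = 20, 3, 4, 500
beta_true = np.zeros(p)
beta_true[:s] = [2.0, -1.5, 1.0]
estimates = []
for _ in range(K):
    X = rng.standard_normal((n_k, p))
    y = X @ beta_true + 0.5 * rng.standard_t(3, size=n_k)
    estimates.append(local_debiased_qr(X, y))
beta_avg = np.mean(estimates, axis=0)          # aggregate local debiased estimates
```

Averaging the locally debiased estimators is what makes the scheme communication-efficient: each machine ships only a length-$p$ vector, and the bias correction keeps the averaged estimator from inheriting the shrinkage bias of the local lasso fits.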
Keywords
distributed estimation, lasso, high-dimensional quantile regression, bias-correction
Received 13 July 2022
Accepted 12 September 2022
Published 19 July 2024