Statistics and Its Interface
Volume 2 (2009)
Number 4
Robust and sparse bridge regression
Pages: 481 – 491
DOI: https://dx.doi.org/10.4310/SII.2009.v2.n4.a9
Authors
Abstract
It is known that when there are heavy-tailed errors or outliers in the response, least squares methods may fail to produce a reliable estimator. In this paper, we propose a generalized Huber criterion that is highly flexible and robust to large errors. We apply the new criterion to the bridge regression family, yielding a method we call robust and sparse bridge regression (RSBR). However, obtaining the RSBR solution requires solving a nonconvex minimization problem, which poses a computational challenge. Building on recent advances in difference of convex (D.C.) programming, coordinate descent algorithms, and local linear approximation, we provide an efficient computational algorithm that attempts to solve this nonconvex problem. Numerical examples show that the proposed RSBR algorithm performs well and is suitable for large-scale problems.
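For concreteness, the RSBR objective can be sketched as a robust loss combined with a bridge (ℓ_q) penalty. The display below is only an illustration: it substitutes the classical Huber loss H_δ for the paper's generalized Huber criterion, whose exact form is not given here.

\[
\hat{\beta} \;=\; \arg\min_{\beta}\; \sum_{i=1}^{n} H_{\delta}\!\left(y_i - x_i^{\top}\beta\right) \;+\; \lambda \sum_{j=1}^{p} |\beta_j|^{q}, \qquad 0 < q \le 1,
\qquad
H_{\delta}(u) =
\begin{cases}
u^{2}/2, & |u| \le \delta,\\[2pt]
\delta|u| - \delta^{2}/2, & |u| > \delta.
\end{cases}
\]

For q < 1 the penalty term is nonconvex, which is why techniques such as local linear approximation of |β_j|^q and D.C. decompositions are typically used to reduce the problem to a sequence of weighted convex (lasso-type) subproblems that coordinate descent can solve efficiently.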
Keywords
coordinate descent, D.C. programming, Huber loss, local linear approximation, regularization
Published 1 January 2009