Statistics and Its Interface
Volume 7 (2014)
Number 4
Special Issue on Modern Bayesian Statistics (Part I)
Guest Editor: Ming-Hui Chen (University of Connecticut)
A new Bayesian lasso
Pages: 571 – 582
DOI: https://dx.doi.org/10.4310/SII.2014.v7.n4.a12
Abstract
Park and Casella (2008) provided the Bayesian lasso for linear models by assigning scale mixture of normal (SMN) priors to the parameters and independent exponential priors to their variances. In this paper, we propose an alternative Bayesian analysis of the lasso problem. A different hierarchical formulation of the Bayesian lasso is introduced by utilizing the scale mixture of uniform (SMU) representation of the Laplace density. We consider a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions. Empirical results and real data analyses show that the new algorithm has good mixing properties and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection. An ECM algorithm is provided to compute the MAP estimates of the parameters. Easy extension to general models is also briefly discussed.
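To make the SMU idea concrete: the Laplace density can be written as a mixture of uniforms, β | u ~ Uniform(−u, u) with u ~ Gamma(2, λ), which yields tractable full conditionals — a truncated normal for each β_j and a shifted exponential for each u_j. The sketch below is a minimal illustration under simplifying assumptions (fixed σ² and λ, componentwise β updates); it is not the paper's full sampler, which also updates the variance and penalty parameters.

```python
import numpy as np
from scipy.stats import truncnorm

def smu_lasso_gibbs(y, X, lam=1.0, sigma2=1.0, n_iter=500, seed=0):
    """Toy Gibbs sampler for the lasso linear model via the SMU
    representation of the Laplace prior (fixed sigma2 and lam assumed):
      beta_j | u_j ~ Uniform(-u_j, u_j),   u_j ~ Gamma(2, lam).
    Full conditionals:
      beta_j | rest : Normal truncated to (-u_j, u_j)
      u_j    | rest : |beta_j| + Exponential(rate=lam)
    (the Gamma(2, lam) mixing density times the 1/(2u) uniform kernel
    leaves exp(-lam*u) on u > |beta_j|, i.e. a shifted exponential).
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    xtx = np.sum(X**2, axis=0)          # x_j' x_j for each column
    beta = np.zeros(p)
    u = np.ones(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        for j in range(p):
            # residual with the j-th effect removed
            r = y - X @ beta + X[:, j] * beta[j]
            m = X[:, j] @ r / xtx[j]            # conditional mean
            s = np.sqrt(sigma2 / xtx[j])        # conditional sd
            a, b = (-u[j] - m) / s, (u[j] - m) / s  # standardized bounds
            beta[j] = truncnorm.rvs(a, b, loc=m, scale=s, random_state=rng)
        # update the uniform bounds given the current beta
        u = np.abs(beta) + rng.exponential(scale=1.0 / lam, size=p)
        draws[t] = beta
    return draws
```

Because every conditional is a standard distribution (truncated normal, shifted exponential), no rejection steps are needed, which is what makes the SMU hierarchy convenient for Gibbs sampling.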
Keywords
lasso, Bayesian lasso, scale mixture of uniform, MCMC, variable selection
2010 Mathematics Subject Classification
62F15
Published 23 December 2014