Communications in Mathematical Sciences
Volume 18 (2020)
Number 1
An efficient and globally convergent algorithm for $\ell_{p,q} - \ell_r$ model in group sparse optimization
Pages: 227 – 258
DOI: https://dx.doi.org/10.4310/CMS.2020.v18.n1.a10
Authors
Abstract
Group sparsity has many applications in various data-science-related problems; it combines the underlying sparsity and group structure of the variables. A general and important model for group sparsity is the $\ell_{p,q} - \ell_r$ optimization model with $p \geq 1$, $0 < q < 1$, $1 \leq r \leq \infty$, which accommodates different types of measurement noise. It includes not only the non-smooth composition of $\ell_q$ ($0 < q < 1$) and $\ell_p$ ($p \geq 1$), but also the non-smooth $\ell_1 / \ell_{\infty}$ fidelity term. In this paper, we present a nontrivial extension of our recent work to solve this general group sparse minimization model. Motivated by a key proposition, our algorithm is designed to shrink the group support and eliminate variables gradually; it is therefore very fast, especially for large-scale problems. Combined with a proximal linearization, it allows an inexact inner loop, implemented by a scaled alternating direction method of multipliers (ADMM), while retaining global convergence. The algorithm gives a unified framework for the full range of parameters. Extensive numerical experiments are presented for various combinations of the parameters $p, q, r$; the comparisons show the advantages of our algorithm over those in existing works.
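To make the two ingredients of the model concrete, the sketch below evaluates the grouped $\ell_{p,q}$ quasi-norm $\sum_g \|x_g\|_p^q$ and an $\ell_r$ residual norm. This is a minimal illustration under the parameter ranges stated in the abstract ($p \geq 1$, $0 < q < 1$, $1 \leq r \leq \infty$); the function names and the specific objective composition are illustrative, not taken from the paper's algorithm.

```python
# Hedged sketch: building blocks of an l_{p,q}-l_r objective.
# Names (lp_norm, group_lpq) are illustrative, not from the paper.

def lp_norm(v, p):
    """||v||_p for finite p >= 1, or max_i |v_i| when p == float('inf')."""
    if p == float('inf'):
        return max(abs(t) for t in v)
    return sum(abs(t) ** p for t in v) ** (1.0 / p)

def group_lpq(x, groups, p=2.0, q=0.5):
    """sum over groups g of ||x_g||_p^q; 0 < q < 1 makes this non-Lipschitz
    at zero groups, which is why zero groups can be eliminated exactly."""
    return sum(lp_norm([x[i] for i in g], p) ** q for g in groups)

x = [1.0, -2.0, 0.0, 0.0, 3.0, 4.0]
groups = [[0, 1], [2, 3], [4, 5]]
# The all-zero middle group contributes nothing:
# ||(1,-2)||_2^0.5 + ||(3,4)||_2^0.5 = 5**0.25 + 5**0.5
print(group_lpq(x, groups, p=2.0, q=0.5))
```

An $\ell_r$ fidelity term $\|Ax - b\|_r$ would be evaluated with the same `lp_norm` helper applied to the residual vector; $r = \infty$ corresponds to uniform-distribution noise and $r = 1$ to Laplace noise, as the keywords suggest.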
Keywords
group sparse, $\ell_{p,q}-\ell_r$ model, non-Lipschitz optimization, Laplace noise, Gaussian noise, uniform distribution noise, lower bound theory, Kurdyka–Łojasiewicz (KL) property
2010 Mathematics Subject Classification
49M05, 65K10, 90C26, 90C30
Received 13 July 2019
Accepted 17 September 2019
Published 1 April 2020