This is a repo for our paper: Analysis of Sparse Group LASSO via AMP.
Sparse Group LASSO (SGL) is a class of convex linear regression problems that includes the Lasso and the Group Lasso as special cases. Its penalty combines L1 and L2 norms to induce sparsity at both the group level and the within-group level. We develop proximal gradient methods (including AMP) as optimizers for SGL, which outperform the widely used blockwise descent method in both per-iteration complexity and convergence rate. We show that AMP not only converges much faster but also characterizes the solution in distribution. This characterization enables further analysis of SGL, e.g., the false discovery rate, the mean squared error, and the effect of group information.
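To make the proximal gradient approach concrete, below is a minimal NumPy sketch of ISTA applied to SGL. It is an illustration, not the repository's implementation: the penalty form `lam * (alpha * ||b||_1 + (1 - alpha) * sum_g ||b_g||_2)`, the function names (`prox_sgl`, `ista_sgl`), and all parameter choices are assumptions. The SGL proximal operator factors into elementwise soft-thresholding followed by groupwise L2 shrinkage.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: prox of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sgl(v, step, lam, alpha, groups):
    # Prox of step * lam * (alpha * ||b||_1 + (1 - alpha) * sum_g ||b_g||_2).
    # (Illustrative penalty form; group weights omitted for simplicity.)
    u = soft_threshold(v, step * lam * alpha)      # within-group L1 shrinkage
    out = np.zeros_like(v)
    for g in groups:                               # groupwise L2 shrinkage
        norm = np.linalg.norm(u[g])
        if norm > 0.0:
            out[g] = max(0.0, 1.0 - step * lam * (1.0 - alpha) / norm) * u[g]
    return out

def ista_sgl(X, y, lam, alpha, groups, n_iter=500):
    # Proximal gradient (ISTA) for 0.5/n * ||y - X b||^2 + SGL penalty.
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2           # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ b) / n
        b = prox_sgl(b - step * grad, step, lam, alpha, groups)
    return b
```

AMP replaces the plain gradient step with a corrected (Onsager-term) iteration built on the same proximal denoiser, which is what yields the distributional characterization of the iterates.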