Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference.

T. Tony Cai, Anru R. Zhang, Yuchen Zhou
Published in: IEEE Transactions on Information Theory (2022)
We study sparse group Lasso for high-dimensional double sparse linear regression, where the parameter of interest is simultaneously element-wise and group-wise sparse. This problem is an important instance of the simultaneously structured model, an actively studied topic in statistics and machine learning. In the noiseless case, matching upper and lower bounds on sample complexity are established for the exact recovery of sparse vectors and for stable estimation of approximately sparse vectors, respectively. In the noisy case, upper and matching minimax lower bounds for estimation error are obtained. We also consider the debiased sparse group Lasso and investigate its asymptotic property for the purpose of statistical inference. Finally, numerical studies are provided to support the theoretical results.
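For reference, the sparse group Lasso estimator in this double sparse setting is commonly formulated as the following penalized least-squares problem; this is a standard formulation, and the specific notation (lambda, lambda_g, group indexing) and tuning-parameter choices here are illustrative rather than taken from the paper:

\[
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n}\,\|y - X\beta\|_2^2 \;+\; \lambda \,\|\beta\|_1 \;+\; \lambda_g \sum_{j=1}^{d} \|\beta_{G_j}\|_2,
\]

where the groups \(G_1, \ldots, G_d\) partition \(\{1, \ldots, p\}\), the \(\ell_1\) term promotes element-wise sparsity, and the sum of group-wise \(\ell_2\) norms promotes group-wise sparsity, yielding the "double sparse" structure studied in the paper.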