
Accelerated Proximal Gradient Algorithm Based on Huber Loss and Group Lasso Penalty Problem
DOI: · Supported by the National Natural Science Foundation of China
Authors: Shen Huiling, Peng Dingtao*, Zhang Xian (School of Mathematics and Statistics, Guizhou University, Guiyang, Guizhou, China)
Abstract: In high-dimensional linear regression, group sparse recovery adds a group Lasso penalty term to the original linear model so as to induce as few nonzero groups as possible, turning the problem into a convex optimization model. This paper studies the combination of the Huber loss with a group Lasso penalty, where the penalty term enforces the sparsity structure of the coefficient groups. First, since the penalty term is a convex but nonsmooth function, its classical subdifferential is derived in order to characterize the optimality conditions of the group Lasso model. Second, an accelerated proximal gradient algorithm based on Nesterov's acceleration technique is proposed to solve the model. Finally, the convergence of the proposed algorithm is proved.
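The abstract describes the model only in words. A standard formulation consistent with that description, pairing the Huber loss on the residuals with a group Lasso penalty over a group partition G, is shown below; the symbols δ (Huber threshold), λ (regularization weight), and G are our own notation, not taken from the paper.

```latex
\min_{x \in \mathbb{R}^p} \; \frac{1}{n}\sum_{i=1}^{n} h_\delta\bigl(a_i^{\top}x - b_i\bigr)
  \;+\; \lambda \sum_{g \in \mathcal{G}} \|x_g\|_2,
\qquad
h_\delta(r) =
\begin{cases}
  \tfrac{1}{2}r^2, & |r| \le \delta,\\[2pt]
  \delta\,|r| - \tfrac{1}{2}\delta^2, & |r| > \delta.
\end{cases}
```

On this assumed model, the accelerated proximal gradient iteration the abstract refers to takes a FISTA-type form: a gradient step on the smooth Huber term (its gradient clips the residuals at ±δ and is 1-Lipschitz, so the full gradient is Lipschitz with constant ‖A‖₂²/n), the proximal map of the group Lasso term (block soft-thresholding, which zeroes out whole groups at once), and a Nesterov momentum extrapolation. The Python sketch below is illustrative only: the function names, the fixed step size 1/L, and the stopping rule are our assumptions, not the paper's exact algorithm.

```python
import numpy as np

def huber_grad(r, delta):
    """Derivative of the Huber function, applied elementwise:
    h'(r) = r for |r| <= delta, and delta * sign(r) otherwise."""
    return np.clip(r, -delta, delta)

def group_soft_threshold(x, groups, tau):
    """Proximal operator of tau * sum_g ||x_g||_2 (block soft-thresholding)."""
    out = x.copy()
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        out[g] = 0.0 if norm_g <= tau else (1.0 - tau / norm_g) * x[g]
    return out

def apg_huber_group_lasso(A, b, groups, lam, delta=1.0, max_iter=500, tol=1e-8):
    """FISTA-type accelerated proximal gradient (hypothetical sketch) for
    min_x (1/n) * sum_i h_delta(a_i^T x - b_i) + lam * sum_g ||x_g||_2."""
    n, p = A.shape
    # h_delta' is 1-Lipschitz, so the gradient of the smooth part is
    # Lipschitz with constant ||A||_2^2 / n; use the standard step 1/L.
    L = np.linalg.norm(A, 2) ** 2 / n
    step = 1.0 / L
    x = np.zeros(p)
    y = x.copy()
    t = 1.0
    for _ in range(max_iter):
        grad = A.T @ huber_grad(A @ y - b, delta) / n     # gradient step on Huber term
        x_new = group_soft_threshold(y - step * grad, groups, step * lam)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # Nesterov momentum weight
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # extrapolation step
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x, t = x_new, t_new
    return x

# Hypothetical usage: recover a signal whose support is one group of five coefficients.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 40))
groups = [np.arange(i, i + 5) for i in range(0, 40, 5)]   # 8 groups of 5
x_true = np.zeros(40)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = apg_huber_group_lasso(A, b, groups, lam=0.1)
```

Two properties of this scheme match the abstract's motivation: because the Huber gradient is obtained by clipping residuals at ±δ, outliers contribute only a bounded gradient, which is what makes the loss robust compared with least squares; and the block soft-thresholding step sets entire groups to zero simultaneously, producing exactly the group sparsity structure the penalty is designed to enforce.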
Citation: Shen, H.L., Peng, D.T. and Zhang, X. (2023) Accelerated Proximal Gradient Algorithm Based on Huber Loss and Group Lasso Penalty Problem. Operations Research and Fuzziology, 13(6), 7333-7345.
