
Proximal Gradient Algorithm for a Class of Mixed Sparse and Group Sparse Optimization Problems
DOI:    Supported by the National Natural Science Foundation of China
Authors: 童兴华, 彭定涛*, 张弦 (School of Mathematics and Statistics, Guizhou University, Guiyang, Guizhou, China)
Keywords: Proximal gradient algorithm
Abstract: This paper studies a class of mixed sparse and group sparse optimization problems in which the loss function is a smooth convex function and the regularizer is a combination of the sparse l1 norm and the group sparse lα,p norm (α ≥ 1, p > 0). First, a proximal gradient algorithm is proposed for solving this mixed sparse and group sparse optimization problem. Second, the convergence of the algorithm is analyzed in both the convex case (p ≥ 1) and the non-convex case (0 < p < 1). Finally, for the non-convex problem, a closed-form expression for the proximal operator of the combined penalty is derived when α = 1, …. These results provide a theoretical foundation and a practicable approach for solving mixed sparse and group sparse optimization problems.
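Since the abstract describes the method only in words, the sketch below illustrates one proximal gradient iteration for the convex instance α = 2, p = 1, i.e. the classical sparse group lasso penalty lam1·||x||1 + lam2·Σ_g ||x_g||2, whose proximal operator has a well-known two-stage closed form (componentwise soft-thresholding followed by group-wise shrinkage). The quadratic loss, the fixed step size 1/L, and all function names are assumptions made for illustration; the paper's non-convex case 0 < p < 1 and its closed-form proximal operator for α = 1 are not reproduced here.

```python
# Minimal sketch, not the paper's algorithm: proximal gradient for
#   minimize 0.5*||A x - b||^2 + lam1*||x||_1 + lam2*sum_g ||x_g||_2
# (the convex case alpha = 2, p = 1). Groups are assumed to partition the indices.
import numpy as np

def soft_threshold(v, tau):
    # Componentwise proximal operator of tau*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_sparse_group(v, groups, lam1, lam2, t):
    # prox of t*(lam1*||x||_1 + lam2*sum_g ||x_g||_2):
    # soft-threshold componentwise, then shrink each group block.
    z = soft_threshold(v, t * lam1)
    x = np.zeros_like(v)
    for g in groups:                      # g is an index array for one group
        ng = np.linalg.norm(z[g])
        if ng > t * lam2:
            x[g] = (1.0 - t * lam2 / ng) * z[g]
    return x

def proximal_gradient(A, b, groups, lam1, lam2, max_iter=500, tol=1e-8):
    # Fixed step size t = 1/L with L the Lipschitz constant of the gradient
    # of the smooth quadratic loss (largest singular value of A, squared).
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)          # gradient of the smooth loss
        x_new = prox_sparse_group(x - t * grad, groups, lam1, lam2, t)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20)
    x_true[:5] = rng.standard_normal(5)   # only the first group is active
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    groups = [np.arange(0, 5), np.arange(5, 10), np.arange(10, 15), np.arange(15, 20)]
    x_hat = proximal_gradient(A, b, groups, lam1=0.1, lam2=0.1)
    print(np.round(x_hat, 3))
```

An accelerated variant would apply the same proximal map at an extrapolated point; nothing in this sketch depends on, or reproduces, the convergence analysis described in the abstract.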
Article citation: 童兴华, 彭定涛, 张弦. 一类混合稀疏组稀疏优化问题的邻近梯度算法[J]. 运筹与模糊学, 2023, 13(6): 7598-7611.
