TY - JOUR
T1 - Linearly-Involved Moreau-Enhanced-Over-Subspace Model
T2 - Debiased Sparse Modeling and Stable Outlier-Robust Regression
AU - Yukawa, Masahiro
AU - Kaneko, Hiroyuki
AU - Suzuki, Kyohei
AU - Yamada, Isao
N1 - Funding Information:
This work was supported by JSPS under Grants-in-Aid 22H01492.
Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - We present an efficient mathematical framework to derive promising methods that enjoy 'enhanced' desirable properties. The popular minimax concave penalty for sparse modeling subtracts, from the ℓ1 norm, its Moreau envelope, inducing nearly unbiased estimates and thus yielding considerable performance enhancements. To extend it to underdetermined linear systems, we propose the projective minimax concave penalty, which leads to 'enhanced' sparseness over the input subspace. We also present a promising regression method which has an 'enhanced' robustness and substantial stability by distinguishing outlier and noise explicitly. The proposed framework, named the linearly-involved Moreau-enhanced-over-subspace (LiMES) model, encompasses those two specific examples as well as two others: stable principal component pursuit and robust classification. The LiMES function involved in the model is an 'additively nonseparable' weakly convex function, while the 'inner' objective function to define the Moreau envelope is 'separable'. This mixed nature of separability and nonseparability allows an application of the LiMES model to the underdetermined case with an efficient algorithmic implementation. Two linear/affine operators play key roles in the model: one corresponds to the projection mentioned above and the other takes care of robust regression/classification. A necessary and sufficient condition for convexity of the smooth part of the objective function is studied. Numerical examples show the efficacy of LiMES in applications to sparse modeling and robust regression.
AB - We present an efficient mathematical framework to derive promising methods that enjoy 'enhanced' desirable properties. The popular minimax concave penalty for sparse modeling subtracts, from the ℓ1 norm, its Moreau envelope, inducing nearly unbiased estimates and thus yielding considerable performance enhancements. To extend it to underdetermined linear systems, we propose the projective minimax concave penalty, which leads to 'enhanced' sparseness over the input subspace. We also present a promising regression method which has an 'enhanced' robustness and substantial stability by distinguishing outlier and noise explicitly. The proposed framework, named the linearly-involved Moreau-enhanced-over-subspace (LiMES) model, encompasses those two specific examples as well as two others: stable principal component pursuit and robust classification. The LiMES function involved in the model is an 'additively nonseparable' weakly convex function, while the 'inner' objective function to define the Moreau envelope is 'separable'. This mixed nature of separability and nonseparability allows an application of the LiMES model to the underdetermined case with an efficient algorithmic implementation. Two linear/affine operators play key roles in the model: one corresponds to the projection mentioned above and the other takes care of robust regression/classification. A necessary and sufficient condition for convexity of the smooth part of the objective function is studied. Numerical examples show the efficacy of LiMES in applications to sparse modeling and robust regression.
KW - Convex optimization
KW - Moreau envelope
KW - proximity operator
KW - weakly convex function
UR - http://www.scopus.com/inward/record.url?scp=85153395150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85153395150&partnerID=8YFLogxK
U2 - 10.1109/TSP.2023.3263724
DO - 10.1109/TSP.2023.3263724
M3 - Article
AN - SCOPUS:85153395150
SN - 1053-587X
VL - 71
SP - 1232
EP - 1247
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
ER -
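
A minimal LaTeX sketch of the minimax concave (MC) penalty construction described in the abstract; the parameter γ and the symbol φ_MC are illustrative assumptions, not taken from the record:

\[
  {}^{\gamma}\bigl(\|\cdot\|_1\bigr)(x) \;:=\; \min_{y \in \mathbb{R}^n}\Bigl(\|y\|_1 + \tfrac{1}{2\gamma}\,\|x - y\|_2^2\Bigr), \qquad \gamma > 0,
\]
\[
  \phi_{\mathrm{MC}}(x) \;:=\; \|x\|_1 - {}^{\gamma}\bigl(\|\cdot\|_1\bigr)(x).
\]

Coordinatewise, the Moreau envelope of the ℓ1 norm is the Huber function, so φ_MC behaves like the ℓ1 norm near the origin and saturates to a constant for entries of magnitude at least γ; this saturation is what underlies the nearly unbiased estimates mentioned in the abstract. The projective MC penalty of the paper further involves, per the abstract, a projection onto the input subspace; see the article for the precise definition.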