TY - GEN
T1 - Sparse Stable Outlier-Robust Regression with Minimax Concave Function
AU - Suzuki, Kyohei
AU - Yukawa, Masahiro
N1 - Funding Information:
This work was supported by the Grants-in-Aid for Scientific Research (KAKENHI) under Grant Numbers 22J22588 and 22H01492.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - We propose a novel formulation for stable sparse recovery from measurements contaminated by outliers and severe noise. The proposed formulation evaluates noise with a quadratic function and outliers with the minimax concave function, reflecting their respective statistical properties (Gaussianity and sparsity). This is a significant departure from conventional robust methods, which typically evaluate noise and outliers with a single loss function, and it leads to a stable estimate. Although the formulation involves a nonconvex penalty to reduce the estimation bias of sparse estimates, convexity of the whole cost is guaranteed under a certain condition by adding a Tikhonov regularization term. The problem is reformulated and solved by the forward-backward primal-dual splitting algorithm, for which convergence conditions are derived. The remarkable outlier robustness of the proposed method is demonstrated by simulations in highly noisy environments.
AB - We propose a novel formulation for stable sparse recovery from measurements contaminated by outliers and severe noise. The proposed formulation evaluates noise with a quadratic function and outliers with the minimax concave function, reflecting their respective statistical properties (Gaussianity and sparsity). This is a significant departure from conventional robust methods, which typically evaluate noise and outliers with a single loss function, and it leads to a stable estimate. Although the formulation involves a nonconvex penalty to reduce the estimation bias of sparse estimates, convexity of the whole cost is guaranteed under a certain condition by adding a Tikhonov regularization term. The problem is reformulated and solved by the forward-backward primal-dual splitting algorithm, for which convergence conditions are derived. The remarkable outlier robustness of the proposed method is demonstrated by simulations in highly noisy environments.
KW - convex optimization
KW - minimax concave function
KW - robust regression
KW - robust sparse recovery
KW - sparse modeling
UR - http://www.scopus.com/inward/record.url?scp=85141011530&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85141011530&partnerID=8YFLogxK
U2 - 10.1109/MLSP55214.2022.9943378
DO - 10.1109/MLSP55214.2022.9943378
M3 - Conference contribution
AN - SCOPUS:85141011530
T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP
BT - 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing, MLSP 2022
PB - IEEE Computer Society
T2 - 32nd IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2022
Y2 - 22 August 2022 through 25 August 2022
ER -