TY - JOUR
T1 - Learning Sparse Graph with Minimax Concave Penalty under Gaussian Markov Random Fields
AU - Koyakumaru, Tatsuya
AU - Yukawa, Masahiro
AU - Pavez, Eduardo
AU - Ortega, Antonio
N1 - Funding Information:
Manuscript received November 23, 2021. Manuscript revised April 1, 2022. Manuscript publicized July 1, 2022. †The authors are with the Department of Electronics and Electrical Engineering, Keio University, Yokohama-shi, 223-8522 Japan. ††The authors are with the Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA 90089-2564, USA. ∗This work was supported by the Grants-in-Aid for Scientific Research (KAKENHI) under Grant JP18H01446. a) E-mail: yukawa@elec.keio.ac.jp DOI: 10.1587/transfun.2021EAP1153
Publisher Copyright:
Copyright © 2023 The Institute of Electronics, Information and Communication Engineers.
PY - 2023/1
Y1 - 2023/1
N2 - This paper presents a convex-analytic framework to learn sparse graphs from data. While our problem formulation is inspired by an extension of the graphical lasso using the so-called combinatorial graph Laplacian framework, a key difference is the use of a nonconvex alternative to the ℓ1 norm to attain graphs with better interpretability. Specifically, we use the weakly-convex minimax concave penalty (the difference between the ℓ1 norm and the Huber function), which is known to yield sparse solutions with lower estimation bias than ℓ1 for regression problems. In our framework, the graph Laplacian is replaced in the optimization by a linear transform of the vector corresponding to its upper triangular part. Via a reformulation relying on Moreau’s decomposition, we show that overall convexity is guaranteed by introducing a quadratic function to our cost function. The problem can be solved efficiently by the primal-dual splitting method, for which admissible conditions ensuring provable convergence are presented. Numerical examples show that the proposed method significantly outperforms the existing graph learning methods with reasonable computation time.
AB - This paper presents a convex-analytic framework to learn sparse graphs from data. While our problem formulation is inspired by an extension of the graphical lasso using the so-called combinatorial graph Laplacian framework, a key difference is the use of a nonconvex alternative to the ℓ1 norm to attain graphs with better interpretability. Specifically, we use the weakly-convex minimax concave penalty (the difference between the ℓ1 norm and the Huber function), which is known to yield sparse solutions with lower estimation bias than ℓ1 for regression problems. In our framework, the graph Laplacian is replaced in the optimization by a linear transform of the vector corresponding to its upper triangular part. Via a reformulation relying on Moreau’s decomposition, we show that overall convexity is guaranteed by introducing a quadratic function to our cost function. The problem can be solved efficiently by the primal-dual splitting method, for which admissible conditions ensuring provable convergence are presented. Numerical examples show that the proposed method significantly outperforms the existing graph learning methods with reasonable computation time.
KW - graph learning
KW - graph signal processing
KW - minimax concave penalty
KW - primal-dual splitting method
KW - proximity operator
UR - http://www.scopus.com/inward/record.url?scp=85145777094&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85145777094&partnerID=8YFLogxK
U2 - 10.1587/transfun.2021EAP1153
DO - 10.1587/transfun.2021EAP1153
M3 - Article
AN - SCOPUS:85145777094
SN - 0916-8508
VL - E106A
SP - 23
EP - 34
JO - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
JF - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
IS - 1
ER -