Distributed Sparse Optimization With Weakly Convex Regularizer: Consensus Promoting and Approximate Moreau Enhanced Penalties Towards Global Optimality

Kei Komuro, Masahiro Yukawa, Renato Luis Garrido Cavalcante

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a promising framework for distributed sparse optimization based on weakly convex regularizers. More specifically, we pose two distributed optimization problems to recover sparse signals in networks. The first problem formulation relies on statistical properties of the signals, and it uses an approximate Moreau enhanced penalty. In contrast, the second formulation does not rely on any statistical assumptions, and it uses an additional consensus promoting penalty (CPP) that convexifies the cost function over the whole network. To solve both problems, we propose a distributed proximal debiasing-gradient (DPD) method, which uses the exact first-order proximal gradient algorithm. The DPD method features a pair of proximity operators that play complementary roles: one sparsifies the estimate, and the other reduces the bias caused by the sparsification. Owing to the convexity of the overall cost functions, the proposed method is guaranteed to converge to a global minimizer, as demonstrated by numerical examples. In addition, the use of CPP improves the convergence speed significantly.
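The abstract refers to proximity operators that sparsify the estimate and reduce the resulting bias. As a minimal, centralized sketch (not the authors' distributed DPD method), the snippet below runs a plain proximal gradient (ISTA-style) iteration with two standard proximity operators: soft thresholding, the prox of the l1 norm, and firm thresholding, the prox of the minimax concave (weakly convex) penalty, which shrinks small entries like soft thresholding but leaves large entries untouched and therefore reduces bias. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t * ||.||_1: shrinks every entry toward zero by t (sparsifies,
    # but biases large coefficients toward zero)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def firm_threshold(v, t, mu):
    # prox of the minimax concave penalty with parameters (t, mu), mu > t:
    # zero below t, linear interpolation on (t, mu], identity above mu,
    # so large entries keep their amplitude (less bias than soft thresholding)
    return np.where(np.abs(v) <= t, 0.0,
           np.where(np.abs(v) <= mu,
                    np.sign(v) * mu * (np.abs(v) - t) / (mu - t),
                    v))

def prox_gradient(A, b, lam, prox, n_iter=500):
    # forward-backward iteration for 0.5*||Ax - b||^2 + lam * penalty:
    #   x <- prox(x - step * A^T (A x - b), step * lam)
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = prox(x - step * A.T @ (A @ x - b), step * lam)
    return x
```

With A = I, b = (3, 0.1), and lam = 1, soft thresholding returns (2, 0) (the large entry is shrunk), while firm thresholding with mu = 2.5 returns (3, 0), recovering the true amplitude; this is the debiasing effect of a weakly convex penalty in miniature.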

Original language: English
Pages (from-to): 514-527
Number of pages: 14
Journal: IEEE Transactions on Signal and Information Processing over Networks
Volume: 8
DOIs
Publication status: Published - 2022

Keywords

  • Distributed optimization
  • sparse optimization
  • weakly convex regularizer

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems
  • Computer Networks and Communications

