We study the use of weakly convex minimax concave (MC) regularizers in distributed sparse optimization. The global cost function is the squared error penalized by the MC regularizer. While the global cost is convex as long as the whole system is overdetermined and the regularization parameter is sufficiently small, the local cost of each node is usually nonconvex, since the systems formed from local measurements are underdetermined in practical applications. The Moreau decomposition is applied to the MC regularizer so that the total cost takes the form of a smooth function plus the rescaled ℓ1 norm. We propose two solvers: the first applies the proximal gradient exact first-order algorithm (PG-EXTRA) directly to our cost, while the second is based on a convex relaxation of the local costs to ensure convergence. Numerical examples show that the proposed approaches attain significant gains over the ℓ1-based PG-EXTRA.
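As a sketch of the structure described above, the (generalized) MC regularizer can be written as the ℓ1 norm minus a Moreau-type envelope, so the regularized squared error splits into a smooth term plus the ℓ1 norm. The symbols below (A, b, B, λ, and the function names) are assumed notation for illustration, not taken from the abstract:

```latex
% Assumed notation: A is the global sensing matrix, b the measurements,
% B a shaping matrix, and \lambda > 0 the regularization parameter.
% MC regularizer = l1 norm minus its generalized Moreau envelope S_B:
\Psi_B(x) := \|x\|_1 - S_B(x),
\qquad
S_B(x) := \min_{v \in \mathbb{R}^n} \Big( \|v\|_1 + \tfrac{1}{2}\|B(x - v)\|_2^2 \Big).

% Global cost: squared error plus the MC penalty.
% Since S_B is smooth, the cost is a smooth function plus the l1 norm:
F(x) = \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \Psi_B(x)
     = \underbrace{\tfrac{1}{2}\|Ax - b\|_2^2 - \lambda S_B(x)}_{\text{smooth}}
       \; + \; \lambda \|x\|_1 .
```

The second form is what makes a proximal gradient method such as PG-EXTRA applicable: the nonsmooth part reduces to the ℓ1 norm, whose proximal operator is soft thresholding.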