Minimizing the Moreau envelope of nonsmooth convex functions over the fixed point set of certain quasi-nonexpansive mappings

Isao Yamada, Masahiro Yukawa, Masao Yamagishi

Research output: Chapter in Book/Report/Conference proceeding › Chapter

54 Citations (Scopus)

Abstract

The first aim of this paper is to present a useful toolbox of quasi-nonexpansive mappings for convex optimization from the viewpoint of using their fixed point sets as constraints. Many convex optimization problems have been solved through elegant translations into fixed point problems. The underlying principle is to apply a certain quasi-nonexpansive mapping T iteratively and generate a sequence converging to one of its fixed points. However, such a mapping often has infinitely many fixed points, so selecting a particular point from the fixed point set Fix(T) is of great importance. Nevertheless, most fixed point methods can only return an “unspecified” point from the fixed point set after many iterations, and it would therefore seem unrealistic to hope for an “optimal” one. Fortunately, by viewing the collection of quasi-nonexpansive mappings as a toolbox, we can accomplish this challenging mission with the hybrid steepest descent method, provided that the cost function is smooth and its derivative is Lipschitz continuous. A question arises: how can we deal with “nonsmooth” cost functions? The second aim is to propose a nontrivial integration of the ideas of the hybrid steepest descent method and the Moreau-Yosida regularization, yielding a useful approach to the challenging problem of nonsmooth convex optimization over Fix(T). The key is to smooth the original nonsmooth cost function with its Moreau-Yosida regularization, whose derivative is always Lipschitz continuous; the hybrid steepest descent method can then be applied to minimize this ideal smooth approximation over Fix(T). We present the mathematical ideas of the proposed approach together with its application to a combinatorial optimization problem: the minimal antenna-subset selection problem under a highly nonlinear capacity constraint for efficient multiple-input multiple-output (MIMO) communication systems.
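To make the algorithmic idea concrete: the hybrid steepest descent method minimizes a smooth convex cost Θ over Fix(T) via the iteration x_{n+1} = T(x_n) − λ_{n+1} ∇Θ(T(x_n)) with a slowly vanishing, non-summable step-size sequence, and the abstract's proposal is to take Θ as the Moreau envelope of the nonsmooth cost f, whose gradient (x − prox_{γf}(x))/γ is (1/γ)-Lipschitz. The minimal NumPy sketch below is not taken from the chapter; the choice of f (the ℓ1 norm, whose proximity operator is soft-thresholding), the mapping T (metric projection onto a hyperplane, a nonexpansive and hence quasi-nonexpansive mapping whose fixed point set is the hyperplane itself), the parameter gamma, and the step-size rule λ_n = 1/(n+1) are all illustrative assumptions.

```python
# Illustrative sketch only: hybrid steepest descent over Fix(T), with the
# nonsmooth cost f(x) = ||x||_1 replaced by the gradient of its Moreau envelope,
# computed via the proximity operator (soft-thresholding). T, gamma, and the
# step-size rule are hypothetical choices, not taken from the chapter.
import numpy as np

def prox_l1(x, gamma):
    """Proximity operator of gamma * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def moreau_envelope_grad(x, gamma):
    """Gradient of the Moreau envelope of ||.||_1; it is (1/gamma)-Lipschitz."""
    return (x - prox_l1(x, gamma)) / gamma

def T(x, a, b):
    """Metric projection onto the hyperplane {x : <a, x> = b}.
    Projections onto closed convex sets are nonexpansive (hence
    quasi-nonexpansive), and Fix(T) is exactly the hyperplane."""
    return x - ((a @ x - b) / (a @ a)) * a

def hybrid_steepest_descent(x0, a, b, gamma=0.1, n_iter=5000):
    """x_{n+1} = T(x_n) - lambda_{n+1} * grad(Moreau envelope)(T(x_n)),
    with a vanishing, non-summable step size lambda_n = 1/(n+1)."""
    x = x0.copy()
    for n in range(n_iter):
        y = T(x, a, b)
        lam = 1.0 / (n + 1)
        x = y - lam * moreau_envelope_grad(y, gamma)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.standard_normal(10)
    b = 1.0
    x0 = rng.standard_normal(10)
    x_star = hybrid_steepest_descent(x0, a, b)
    print("constraint residual:", abs(a @ x_star - b))
    print("ell-1 value:", np.abs(x_star).sum())
```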

Original language: English
Title of host publication: Springer Optimization and Its Applications
Publisher: Springer International Publishing
Pages: 345-390
Number of pages: 46
Volume: 49
DOIs: https://doi.org/10.1007/978-1-4419-9569-8_17
Publication status: Published - 2011
Externally published: Yes

Publication series

Name: Springer Optimization and Its Applications
Volume: 49
ISSN (Print): 1931-6828
ISSN (Electronic): 1931-6836

Fingerprint

Moreau Envelope
Nonsmooth Function
Fixed Point Set
Nonexpansive Mapping
Steepest Descent Method
Convex Function
Moreau-Yosida Regularization
Convex Optimization
Cost Function
Lipschitz
Fixed Point
Smooth Approximation
Antenna Selection
Derivative
Subset Selection
Fixed Point Method
Nonsmooth Optimization
Capacity Constraints
Nonlinear Constraints
Multiple-input multiple-output (MIMO) Systems

Keywords

  • Hybrid steepest descent method
  • Moreau envelope
  • Nonsmooth convex optimization

ASJC Scopus subject areas

  • Control and Optimization

Cite this

Yamada, I., Yukawa, M., & Yamagishi, M. (2011). Minimizing the Moreau envelope of nonsmooth convex functions over the fixed point set of certain quasi-nonexpansive mappings. In Springer Optimization and Its Applications (Vol. 49, pp. 345-390). (Springer Optimization and Its Applications; Vol. 49). Springer International Publishing. https://doi.org/10.1007/978-1-4419-9569-8_17
