Minkovskian gradient for sparse optimization

Shun-ichi Amari, Masahiro Yukawa

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Information geometry is used to elucidate convex optimization problems under an L1 constraint. A convex function induces a Riemannian metric and two dually coupled affine connections on the manifold of parameters of interest. A generalized Pythagorean theorem and a projection theorem hold in such a manifold. An extended LARS algorithm, applicable to both under-determined and over-determined cases, is studied, and properties of its solution path are given. The algorithm is shown to be a Minkovskian gradient-descent method, which moves in the steepest direction of a target function under the Minkovskian L1 norm. Two dually coupled affine coordinate systems are useful for analyzing the solution path.
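For context (an illustrative note, not part of the published abstract): the divergence induced by a convex function \varphi is the Bregman divergence

    D_\varphi(p, q) = \varphi(p) - \varphi(q) - \langle \nabla\varphi(q),\, p - q \rangle,

and the generalized Pythagorean theorem is the standard statement that D_\varphi(p, r) = D_\varphi(p, q) + D_\varphi(q, r) whenever the geodesics joining p to q and q to r, taken with respect to the two dually coupled connections, intersect orthogonally at q in the induced Riemannian metric.

The "steepest direction under the L1 norm" can also be made concrete: minimizing \langle \nabla f(w), d \rangle over \|d\|_1 \le 1 selects a single signed coordinate direction, the coordinate where the gradient is largest in magnitude, which is why a Minkovskian gradient flow tends to activate variables one at a time, as a LARS-type path does. The Python sketch below is a toy illustration of this selection rule for a quadratic target; it is not the authors' extended LARS, and the function name, step size, and iteration count are hypothetical choices.

    import numpy as np

    def l1_steepest_descent(X, y, n_steps=500, step=0.01):
        """Toy sketch: greedy coordinate updates as steepest descent
        under the L1 (Minkowski) norm for f(w) = 0.5 * ||y - X w||^2.
        Hypothetical illustration, not the paper's algorithm."""
        w = np.zeros(X.shape[1])
        for _ in range(n_steps):
            grad = X.T @ (X @ w - y)          # gradient of the quadratic target
            j = int(np.argmax(np.abs(grad)))  # coordinate of steepest L1 descent
            w[j] -= step * np.sign(grad[j])   # move only along that coordinate
        return w

    # Under-determined example (more features than samples):
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 100))
    w_true = np.zeros(100)
    w_true[:3] = [2.0, -1.5, 1.0]             # sparse ground truth
    w_hat = l1_steepest_descent(X, X @ w_true)

Each iterate changes a single coordinate, so the support of w grows one variable at a time, mirroring the solution-path behavior described in the abstract.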

Original language: English
Article number: 6414587
Pages (from-to): 576-585
Number of pages: 10
Journal: IEEE Journal of Selected Topics in Signal Processing
Volume: 7
Issue number: 4
DOIs
Publication status: Published - 2013

Keywords

  • Extended LARS
  • L1-constraint
  • information geometry
  • sparse convex optimization

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
