Projection-based dual averaging for stochastic sparse optimization

Asahi Ushio, Masahiro Yukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

We present a variant of the regularized dual averaging (RDA) algorithm for stochastic sparse optimization. Our approach differs from previous studies of RDA in two respects. First, a sparsity-promoting metric is employed, which originates from the proportionate-type adaptive filtering algorithms. Second, the squared-distance function to a closed convex set is employed as part of the objective function. In the particular application of online regression, the squared-distance function reduces to a normalized version of the typical squared-error (least-squares) function. These two differences yield a better sparsity-seeking capability, leading to improved convergence properties. Numerical examples show the advantages of the proposed algorithm over existing methods, including ADAGRAD and adaptive proximal forward-backward splitting (APFBS).
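
To make the two ingredients described above concrete, the sketch below combines an RDA-style dual-averaging update with (i) a proportionate-type diagonal metric that lets large coefficients move more, and (ii) the gradient of the squared distance to the hyperplane defined by each regression sample, which is the normalized residual mentioned in the abstract. This is only a minimal illustration under those assumptions, not the algorithm of the paper; the function name pda_sketch and all parameter choices (lam, beta, eps, n_iter) are hypothetical.

import numpy as np

def pda_sketch(stream, dim, lam=1e-3, beta=1.0, eps=1e-3, n_iter=2000):
    """Illustrative RDA-style update for sparse online regression.

    At step t the loss is (1/2) * dist(x, C_t)^2 with
    C_t = {z : a_t^T z = b_t}; its gradient is the normalized residual
    ((a_t^T x - b_t) / ||a_t||^2) * a_t.  (Hypothetical sketch, not the
    exact algorithm of Ushio & Yukawa, ICASSP 2017.)
    """
    x = np.zeros(dim)        # primal iterate
    g_bar = np.zeros(dim)    # running average of gradients (dual average)
    for t in range(1, n_iter + 1):
        a, b = next(stream)  # one regression sample (a_t, b_t)
        # Gradient of the squared distance to the hyperplane C_t,
        # i.e. a normalized squared-error gradient.
        grad = ((a @ x - b) / (a @ a + 1e-12)) * a
        g_bar += (grad - g_bar) / t
        # Proportionate-type diagonal metric: coefficients that are already
        # large are penalized less, so they are allowed to move more.
        q = 1.0 / (np.abs(x) + eps)
        q *= dim / q.sum()   # normalize the metric to unit mean
        # Closed-form RDA update with l1 regularizer, diagonal metric q and
        # beta_t = beta * sqrt(t): coordinate-wise soft-thresholding.
        shrunk = np.sign(g_bar) * np.maximum(np.abs(g_bar) - lam, 0.0)
        x = -(np.sqrt(t) / (beta * q)) * shrunk
    return x

# Usage sketch: recover a 5-sparse vector from streaming noisy measurements.
rng = np.random.default_rng(0)
w_true = np.zeros(100)
w_true[rng.choice(100, size=5, replace=False)] = rng.normal(size=5)

def samples():
    while True:
        a = rng.normal(size=100)
        yield a, float(a @ w_true) + 0.01 * rng.normal()

w_hat = pda_sketch(samples(), dim=100)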

Original language: English
Title of host publication: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2307-2311
Number of pages: 5
ISBN (Electronic): 9781509041176
DOI: 10.1109/ICASSP.2017.7952568
Publication status: Published - 2017 Jun 16
Event: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - New Orleans, United States
Duration: 2017 Mar 5 - 2017 Mar 9

Other

Other: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017
Country: United States
City: New Orleans
Period: 17/3/5 - 17/3/9

Keywords

  • online learning
  • orthogonal projection
  • proximity operator
  • sparse optimization
  • stochastic optimization

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Ushio, A., & Yukawa, M. (2017). Projection-based dual averaging for stochastic sparse optimization. In 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings (pp. 2307-2311). [7952568] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICASSP.2017.7952568
