A fast stochastic gradient algorithm: Maximal use of sparsification benefits under computational constraints

Masahiro Yukawa, Wolfgang Utschick

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits from the sparsification under computational constraints. To this end, we formulate the task of algorithm design as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints are formed by focusing on the fact that the energy of the sparsified error vector concentrates in the first few components. Numerical examples demonstrate that the proposed algorithm achieves convergence as fast as the computationally expensive method based on the optimization without the computational constraints.
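The abstract's core idea, spending the update budget only where the error energy concentrates, can be illustrated with a generic partial-update stochastic gradient filter. This is a minimal sketch under assumed details, not the paper's closed-form design: the function name, step size, truncation level k, and the toy sparse system are all hypothetical.

```python
import numpy as np

def sparsified_sg_step(w, x, d, mu=0.5, k=4, eps=1e-8):
    # One illustrative stochastic-gradient update that spends its
    # computation on only the k largest-magnitude regressor components,
    # mimicking the idea that most of the energy concentrates in a few
    # components. Generic partial-update NLMS-style sketch, NOT the
    # paper's closed-form design.
    e = d - w @ x                         # a priori output error
    idx = np.argsort(np.abs(x))[-k:]      # components carrying most energy
    w = w.copy()
    w[idx] += mu * e * x[idx] / (x[idx] @ x[idx] + eps)
    return w, e

# Identify a hypothetical sparse unknown system with the truncated update.
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[1, 5, 11]] = [1.0, -0.5, 0.3]          # sparse "true" system (assumed)
w = np.zeros(16)
for _ in range(3000):
    x = rng.standard_normal(16)
    d = float(h @ x)                      # noiseless desired signal
    w, _ = sparsified_sg_step(w, x, d)
print(float(np.linalg.norm(w - h)))       # final misalignment (small)
```

Each iteration here touches only k of the 16 taps, so the per-step cost drops roughly by a factor of k/N while, for white inputs, the filter still identifies the system; the paper's contribution is an optimized (closed-form) design of such constrained updates rather than this ad hoc truncation.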

Original language: English
Pages (from-to): 467-475
Number of pages: 9
Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Volume: E93-A
Issue number: 2
DOI: 10.1587/transfun.E93.A.467
Publication status: Published - 2010 Feb
Externally published: Yes

Keywords

  • Adaptive filter
  • Proportionate adaptive filtering
  • Stochastic gradient algorithm

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Graphics and Computer-Aided Design
  • Applied Mathematics
  • Signal Processing

Cite this

@article{e44bda99a1874f72b67acbd31b3d5d62,
title = "A fast stochastic gradient algorithm: Maximal use of sparsification benefits under computational constraints",
abstract = "In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits from the sparsification under computational constraints. To this end, we formulate the task of algorithm design as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints are formed by focusing on the fact that the energy of the sparsified error vector concentrates in the first few components. Numerical examples demonstrate that the proposed algorithm achieves convergence as fast as the computationally expensive method based on the optimization without the computational constraints.",
keywords = "Adaptive filter, Proportionate adaptive filtering, Stochastic gradient algorithm",
author = "Masahiro Yukawa and Wolfgang Utschick",
year = "2010",
month = feb,
doi = "10.1587/transfun.E93.A.467",
language = "English",
volume = "E93-A",
pages = "467--475",
journal = "IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences",
issn = "0916-8508",
publisher = "Maruzen Co., Ltd/Maruzen Kabushikikaisha",
number = "2",
}

TY - JOUR

T1 - A fast stochastic gradient algorithm

T2 - Maximal use of sparsification benefits under computational constraints

AU - Yukawa, Masahiro

AU - Utschick, Wolfgang

PY - 2010/2

Y1 - 2010/2

N2 - In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits from the sparsification under computational constraints. To this end, we formulate the task of algorithm design as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints are formed by focusing on the fact that the energy of the sparsified error vector concentrates in the first few components. Numerical examples demonstrate that the proposed algorithm achieves convergence as fast as the computationally expensive method based on the optimization without the computational constraints.

AB - In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and maximize the benefits from the sparsification under computational constraints. To this end, we formulate the task of algorithm design as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints are formed by focusing on the fact that the energy of the sparsified error vector concentrates in the first few components. Numerical examples demonstrate that the proposed algorithm achieves convergence as fast as the computationally expensive method based on the optimization without the computational constraints.

KW - Adaptive filter

KW - Proportionate adaptive filtering

KW - Stochastic gradient algorithm

UR - http://www.scopus.com/inward/record.url?scp=77950831017&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77950831017&partnerID=8YFLogxK

U2 - 10.1587/transfun.E93.A.467

DO - 10.1587/transfun.E93.A.467

M3 - Article

AN - SCOPUS:77950831017

VL - E93-A

SP - 467

EP - 475

JO - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences

JF - IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences

SN - 0916-8508

IS - 2

ER -