An efficient kernel normalized least mean square algorithm with compactly supported kernel

Osamu Toda, Masahiro Yukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

We investigate the use of compactly supported kernels (CSKs) for the kernel normalized least mean square (KNLMS) algorithm proposed initially by Richard et al. in 2009. The use of CSKs yields sparse kernelized input vectors, offering an opportunity for complexity reduction. We propose a simple two-step method to compute the kernelized input vectors efficiently. In the first step, it computes an over-estimation of the support of the kernelized input vector based on a certain ℓ1-ball. In the second step, it identifies the exact support through detailed examinations based on an ℓ2-ball. We also employ the support identified in the second step for coherence construction. The proposed method reduces the number of ℓ2-distance evaluations, leading to complexity reduction. Numerical examples show that the proposed algorithm achieves a significant reduction in complexity.
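
As a rough illustration of the two-step support identification described in the abstract, the sketch below first screens dictionary entries with a cheap ℓ1 test and only then evaluates exact ℓ2 distances for the surviving candidates. This is a minimal sketch under stated assumptions, not the construction from the paper: the kernel shape, function names, and the screening radius √d·r (justified by the norm inequality ‖v‖₁ ≤ √d·‖v‖₂, which guarantees that no entry inside the ℓ2 support ball is discarded in the first step) are illustrative choices.

```python
import numpy as np


def csk_value(dist, radius):
    # Illustrative compactly supported radial kernel (Wendland-type sketch):
    # kappa(t) = (1 - t/radius)^2 for t < radius, and 0 otherwise.
    # The specific CSK used in the paper is not reproduced here.
    t = np.asarray(dist) / radius
    return np.where(t < 1.0, (1.0 - t) ** 2, 0.0)


def kernelized_input(x, dictionary, radius):
    # Step 1 (over-estimation of the support): keep only dictionary entries
    # whose l1 distance to x is at most sqrt(d) * radius. Because
    # ||v||_1 <= sqrt(d) * ||v||_2, every entry inside the l2 support ball
    # passes this cheap test, so no active entry is lost.
    d = x.shape[0]
    l1_dists = np.abs(dictionary - x).sum(axis=1)
    candidates = np.where(l1_dists <= np.sqrt(d) * radius)[0]

    # Step 2 (exact support): evaluate l2 distances only for the survivors
    # and keep those strictly inside the kernel's support radius.
    k = np.zeros(dictionary.shape[0])
    if candidates.size == 0:
        return k, candidates
    l2_dists = np.linalg.norm(dictionary[candidates] - x, axis=1)
    inside = l2_dists < radius
    support = candidates[inside]
    k[support] = csk_value(l2_dists[inside], radius)
    return k, support


# Example usage: 200 dictionary entries in R^5 and one new input vector.
rng = np.random.default_rng(0)
D = rng.standard_normal((200, 5))
x = rng.standard_normal(5)
k, support = kernelized_input(x, D, radius=1.5)
print(f"{support.size} active entries out of {D.shape[0]}")
```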

Original language: English
Title of host publication: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3367-3371
Number of pages: 5
Volume: 2015-August
ISBN (Print): 9781467369978
DOI: https://doi.org/10.1109/ICASSP.2015.7178595
Publication status: Published - 2015 Aug 4
Event: 40th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2015 - Brisbane, Australia
Duration: 2015 Apr 19 → 2015 Apr 24


Keywords

  • Compactly supported function
  • Gaussian kernel
  • Kernel learning
  • Positive definite function
  • Radial basis function

ASJC Scopus subject areas

  • Signal Processing
  • Software
  • Electrical and Electronic Engineering

Cite this

Toda, O., & Yukawa, M. (2015). An efficient kernel normalized least mean square algorithm with compactly supported kernel. In ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings (Vol. 2015-August, pp. 3367-3371). [7178595] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICASSP.2015.7178595
