Efficient Dictionary-Refining Kernel Adaptive Filter with Fundamental Insights

Masa Aki Takizawa, Masahiro Yukawa

Research output: Contribution to journal › Article

16 Citations (Scopus)

Abstract

First, we study a basic kernel adaptive filtering algorithm based on the stochastic restricted gradient, which is a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error surface analysis shows that the algorithm has a decorrelation property due to its explicit use of the kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane as the kernel normalized least mean square (KNLMS) algorithm, but with a different metric. This gives us a fundamental insight bridging two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm. The same metric as used for the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show a monotone approximation property of the proposed algorithm with respect to a modified cost function under certain conditions. Numerical examples show the efficacy of the proposed algorithm.
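
To make the forward-backward splitting structure described in the abstract concrete, the following is a minimal, illustrative Python sketch; it is not the paper's exact update rules or metrics. The forward step here is a normalized, KNLMS-like error-correcting update, and the backward step is Euclidean soft-thresholding (the proximity operator of a weighted ℓ1 norm), after which atoms whose coefficients are driven to zero are pruned from the dictionary. The class name, Gaussian kernel, coherence-based dictionary growth, and all parameter values are illustrative assumptions.

```python
import numpy as np


def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel, a common choice in kernel adaptive filtering."""
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2.0 * sigma ** 2))


class SketchKAF:
    """Illustrative dictionary-refining kernel adaptive filter (not the paper's algorithm).

    Forward step: normalized (KNLMS-like) error-correcting gradient step.
    Backward step: soft-thresholding, i.e., the Euclidean proximity operator
    of a weighted L1 norm, which zeroes out small coefficients so that the
    corresponding dictionary atoms can be discarded.
    """

    def __init__(self, sigma=1.0, step=0.5, reg=1e-3, coherence=0.9, eps=1e-6):
        self.sigma = sigma          # kernel width
        self.step = step            # step size of the forward (gradient) step
        self.reg = reg              # L1 weight, acts as the soft-threshold level
        self.coherence = coherence  # novelty threshold for adding dictionary atoms
        self.eps = eps              # avoids division by zero in the normalization
        self.dictionary = []        # stored input vectors (atoms)
        self.coeffs = np.zeros(0)   # expansion coefficients

    def predict(self, x):
        if not self.dictionary:
            return 0.0
        k = np.array([gaussian_kernel(x, c, self.sigma) for c in self.dictionary])
        return float(self.coeffs @ k)

    def update(self, x, d):
        # Grow the dictionary only when the new input is sufficiently novel
        # (coherence criterion), as is common in dictionary-based kernel filters.
        k = np.array([gaussian_kernel(x, c, self.sigma) for c in self.dictionary])
        if len(self.dictionary) == 0 or k.max() < self.coherence:
            self.dictionary.append(np.asarray(x, dtype=float))
            self.coeffs = np.append(self.coeffs, 0.0)
            k = np.append(k, gaussian_kernel(x, x, self.sigma))

        # Forward step: normalized error-correcting update (KNLMS-like).
        err = d - float(self.coeffs @ k)
        self.coeffs = self.coeffs + self.step * err * k / (k @ k + self.eps)

        # Backward step: soft-thresholding, the prox of reg * ||.||_1 in the
        # ordinary Euclidean metric.
        self.coeffs = np.sign(self.coeffs) * np.maximum(np.abs(self.coeffs) - self.reg, 0.0)

        # Dictionary refinement: drop atoms whose coefficients were thresholded to zero.
        keep = np.abs(self.coeffs) > 0.0
        self.dictionary = [c for c, kept in zip(self.dictionary, keep) if kept]
        self.coeffs = self.coeffs[keep]
        return err


# Toy usage: track a noisy nonlinear mapping online.
rng = np.random.default_rng(0)
kaf = SketchKAF(sigma=0.5, step=0.5, reg=1e-3, coherence=0.95)
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0, size=1)
    d = np.sin(3.0 * x[0]) + 0.01 * rng.standard_normal()
    kaf.update(x, d)
print(len(kaf.dictionary), kaf.predict(np.array([0.3])))
```

Note that, per the abstract, the paper's forward step uses the same non-Euclidean metric as the normalized restricted-gradient algorithm rather than the KNLMS-style normalization above, while its backward step remains Euclidean; the sketch only mirrors the overall forward-backward and dictionary-pruning structure.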

Original language: English
Article number: 7465845
Pages (from-to): 4337-4350
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 64
Issue number: 16
DOI: 10.1109/TSP.2016.2563401
Publication status: Published - 2016 Aug 15

Keywords

  • adaptive proximal forward-backward splitting algorithm
  • Kernel adaptive filtering
  • reproducing kernel Hilbert space
  • ℓ1 norm regularization

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Efficient Dictionary-Refining Kernel Adaptive Filter with Fundamental Insights. / Takizawa, Masa Aki; Yukawa, Masahiro.

In: IEEE Transactions on Signal Processing, Vol. 64, No. 16, 7465845, 15.08.2016, p. 4337-4350.

Research output: Contribution to journal › Article

@article{e558dd6d33d54827b80ce9e4d2c0a30b,
title = "Efficient Dictionary-Refining Kernel Adaptive Filter with Fundamental Insights",
abstract = "First, we study a basic kernel adaptive filtering algorithm based on stochastic restricted-gradient , which is a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error surface analysis shows that the algorithm has a decorrelation property due to its explicit use of kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane, but with a different metric, as the kernel normalized least mean square (KNLMS) algorithm. This gives us a fundamental insight bridging two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm. The same metric as used for the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show a monotone approximation property of the proposed algorithm with respect to some modified cost function under certain conditions. Numerical examples show the efficacy of the proposed algorithm.",
keywords = "adaptive proximal forward-backward splitting algorithm, Kernel adaptive filtering, reproducing kernel Hilbert space, {\ell }-{1} norm regularization",
author = "Takizawa, {Masa Aki} and Masahiro Yukawa",
year = "2016",
month = "8",
day = "15",
doi = "10.1109/TSP.2016.2563401",
language = "English",
volume = "64",
pages = "4337--4350",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "16",

}

TY - JOUR

T1 - Efficient Dictionary-Refining Kernel Adaptive Filter with Fundamental Insights

AU - Takizawa, Masa Aki

AU - Yukawa, Masahiro

PY - 2016/8/15

Y1 - 2016/8/15

N2 - First, we study a basic kernel adaptive filtering algorithm based on the stochastic restricted gradient, which is a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error surface analysis shows that the algorithm has a decorrelation property due to its explicit use of the kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane as the kernel normalized least mean square (KNLMS) algorithm, but with a different metric. This gives us a fundamental insight bridging two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm. The same metric as used for the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show a monotone approximation property of the proposed algorithm with respect to a modified cost function under certain conditions. Numerical examples show the efficacy of the proposed algorithm.

AB - First, we study a basic kernel adaptive filtering algorithm based on the stochastic restricted gradient, which is a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error surface analysis shows that the algorithm has a decorrelation property due to its explicit use of the kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane as the kernel normalized least mean square (KNLMS) algorithm, but with a different metric. This gives us a fundamental insight bridging two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm. The same metric as used for the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show a monotone approximation property of the proposed algorithm with respect to a modified cost function under certain conditions. Numerical examples show the efficacy of the proposed algorithm.

KW - adaptive proximal forward-backward splitting algorithm

KW - Kernel adaptive filtering

KW - reproducing kernel Hilbert space

KW - ℓ1 norm regularization

UR - http://www.scopus.com/inward/record.url?scp=84980416751&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84980416751&partnerID=8YFLogxK

U2 - 10.1109/TSP.2016.2563401

DO - 10.1109/TSP.2016.2563401

M3 - Article

AN - SCOPUS:84980416751

VL - 64

SP - 4337

EP - 4350

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

IS - 16

M1 - 7465845

ER -