Efficient Dictionary-Refining Kernel Adaptive Filter with Fundamental Insights

Masa-aki Takizawa, Masahiro Yukawa

Research output: Contribution to journal › Article › peer-review

26 Citations (Scopus)

Abstract

First, we study a basic kernel adaptive filtering algorithm based on the stochastic restricted gradient, which is a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error-surface analysis shows that the algorithm has a decorrelation property due to its explicit use of the kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane as the kernel normalized least mean square (KNLMS) algorithm, but with a different metric. This gives us a fundamental insight bridging two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm. The same metric as used for the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show a monotone approximation property of the proposed algorithm with respect to a modified cost function under certain conditions. Numerical examples demonstrate the efficacy of the proposed algorithm.
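To make the algorithmic structure described above concrete, the following minimal Python sketch illustrates the general idea of a kernel adaptive filter whose coefficients are updated by a forward-backward (proximal) scheme with an l1 penalty that prunes dictionary atoms. It is not the authors' exact algorithm: the Gaussian kernel width, step size, regularization weight, coherence threshold, and the use of a plain normalized-gradient forward step are all illustrative assumptions.

import numpy as np

def gaussian_kernel(x, y, width=0.5):
    # Gaussian (RBF) kernel; the width is an assumed illustrative value.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * width ** 2))

class KernelAdaptiveFilterFB:
    """Illustrative kernel adaptive filter with an l1-based
    forward-backward (proximal) update that prunes the dictionary.
    This is a sketch of the general technique, not the paper's algorithm."""

    def __init__(self, step=0.2, lam=1e-3, coherence=0.9, width=0.5):
        self.step = step            # gradient (forward) step size (assumed)
        self.lam = lam              # l1 regularization weight (assumed)
        self.coherence = coherence  # dictionary admission threshold (assumed)
        self.width = width
        self.dictionary = []        # stored input vectors (dictionary atoms)
        self.coeffs = np.zeros(0)   # kernel expansion coefficients

    def _kvec(self, x):
        # Kernel evaluations between x and every dictionary atom.
        return np.array([gaussian_kernel(x, c, self.width) for c in self.dictionary])

    def predict(self, x):
        if not self.dictionary:
            return 0.0
        return float(self.coeffs @ self._kvec(x))

    def update(self, x, d):
        # Coherence criterion: admit x as a new atom only if it is sufficiently novel.
        k = self._kvec(x)
        if not self.dictionary or np.max(k) <= self.coherence:
            self.dictionary.append(np.asarray(x, dtype=float))
            self.coeffs = np.append(self.coeffs, 0.0)
            k = np.append(k, gaussian_kernel(x, x, self.width))
        err = d - float(self.coeffs @ k)
        # Forward step: normalized gradient correction of the coefficients.
        self.coeffs = self.coeffs + self.step * err * k / (k @ k + 1e-12)
        # Backward step: soft thresholding (proximal operator of the l1 norm)
        # in the ordinary Euclidean metric, which zeroes small coefficients.
        self.coeffs = np.sign(self.coeffs) * np.maximum(np.abs(self.coeffs) - self.lam, 0.0)
        # Dictionary refinement: drop atoms whose coefficients were zeroed.
        keep = np.abs(self.coeffs) > 0.0
        self.dictionary = [c for c, m in zip(self.dictionary, keep) if m]
        self.coeffs = self.coeffs[keep]
        return err

# Usage sketch: identify a toy nonlinear system from streaming samples.
rng = np.random.default_rng(0)
filt = KernelAdaptiveFilterFB()
for _ in range(500):
    x = rng.uniform(-1, 1, size=2)
    d = np.sin(3 * x[0]) + 0.5 * x[1] ** 2 + 0.01 * rng.standard_normal()
    filt.update(x, d)
print(len(filt.dictionary), "dictionary atoms retained")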

Original language: English
Article number: 7465845
Pages (from-to): 4337-4350
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 64
Issue number: 16
DOIs
Publication status: Published - 2016 Aug 15

Keywords

  • Kernel adaptive filtering
  • adaptive proximal forward-backward splitting algorithm
  • reproducing kernel Hilbert space
  • ℓ1-norm regularization

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

