Abstract
First, we study a basic kernel adaptive filtering algorithm based on the stochastic restricted gradient, a natural extension of the naive online regularized risk minimization algorithm (NORMA). Our error-surface analysis shows that the algorithm has a decorrelation property owing to its explicit use of the kernel matrix. It turns out that its normalized version projects the current coefficient vector onto the same hyperplane as the kernel normalized least mean square (KNLMS) algorithm, but with a different metric. This gives a fundamental insight that bridges two classes of kernel adaptive filtering algorithms. Second, we use this insight to derive an efficient forward-backward splitting algorithm: the metric used in the normalized algorithm is employed in the forward step, whereas the ordinary Euclidean metric is employed in the backward step, leading to efficient adaptive refinements of the filter dictionary. We show that, under certain conditions, the proposed algorithm enjoys a monotone approximation property with respect to a modified cost function. Numerical examples demonstrate the efficacy of the proposed algorithm.
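The abstract alludes to the structure of the proposed update without spelling it out. The following Python sketch illustrates, under stated assumptions, the generic shape of one online forward-backward (proximal) splitting iteration for a kernel adaptive filter with an ℓ1 penalty. Everything here is an illustrative assumption rather than the authors' exact algorithm: the Gaussian kernel, the step size `eta`, the regularization weight `lam`, the regularized Gram-matrix solve standing in for the kernel-matrix-induced metric of the forward step, and the coefficient-pruning rule that mimics dictionary refinement.

```python
# Illustrative sketch only -- NOT the paper's reference implementation.
# All parameter names and numeric choices below are hypothetical.
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    diff = x - y
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

def fb_splitting_step(D, alpha, x, d_n, eta=0.1, lam=1e-3, sigma=1.0,
                      prune_tol=1e-4):
    """One forward-backward splitting iteration.

    Forward: gradient step on the instantaneous squared error under a
    kernel-matrix-induced metric (approximated here by a regularized
    solve with the Gram matrix of the dictionary D).
    Backward: Euclidean proximal step for the l1 penalty, i.e.
    componentwise soft-thresholding, followed by pruning of near-zero
    coefficients as a stand-in for dictionary refinement.
    """
    k = np.array([gaussian_kernel(x, c, sigma) for c in D])  # kernelized input
    e = d_n - k @ alpha                                      # a priori error

    # Forward step: solve (K + eps*I) g = e*k rather than inverting K.
    K = np.array([[gaussian_kernel(ci, cj, sigma) for cj in D] for ci in D])
    g = np.linalg.solve(K + 1e-8 * np.eye(len(D)), e * k)
    alpha = alpha + eta * g

    # Backward step: prox of eta*lam*||.||_1 in the Euclidean metric.
    alpha = np.sign(alpha) * np.maximum(np.abs(alpha) - eta * lam, 0.0)

    # Refinement: drop dictionary atoms whose coefficients were zeroed.
    keep = np.abs(alpha) > prune_tol
    return [c for c, flag in zip(D, keep) if flag], alpha[keep]

# Toy usage: track d = sin(x) online with a small fixed initial dictionary.
rng = np.random.default_rng(0)
D = [np.array([c]) for c in np.linspace(-3, 3, 8)]
alpha = np.zeros(len(D))
for _ in range(200):
    x = rng.uniform(-3, 3, size=1)
    D, alpha = fb_splitting_step(D, alpha, x, np.sin(x[0]))
```

Dictionary growth (e.g., coherence-based admission of new atoms) is omitted to keep the sketch short; the point is only the split between a metric-weighted forward gradient step and a Euclidean soft-thresholding backward step, which is the structure the abstract describes.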
| Original language | English |
| --- | --- |
| Article number | 7465845 |
| Pages (from-to) | 4337-4350 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 64 |
| Issue number | 16 |
| DOIs | |
| Publication status | Published - 2016 Aug 15 |
Keywords
- Kernel adaptive filtering
- adaptive proximal forward-backward splitting algorithm
- reproducing kernel Hilbert space
- ℓ1-norm regularization
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering