Online Nonlinear Estimation via Iterative L²-Space Projections

Reproducing Kernel of Subspace

Motoya Ohnishi, Masahiro Yukawa

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

We propose a novel online learning paradigm for nonlinear-function estimation tasks based on the iterative projections in the L² space with probability measure reflecting the stochastic property of input signals. The proposed learning algorithm exploits the reproducing kernel of the so-called dictionary subspace, based on the fact that any finite-dimensional space of functions has a reproducing kernel characterized by the Gram matrix. The L²-space geometry provides the best decorrelation property in principle. The proposed learning paradigm is significantly different from the conventional kernel-based learning paradigm in two senses: (i) the whole space is not a reproducing kernel Hilbert space and (ii) the minimum mean squared error estimator gives the best approximation of the desired nonlinear function in the dictionary subspace. It preserves efficiency in computing the inner product as well as in updating the Gram matrix when the dictionary grows. Monotone approximation, asymptotic optimality, and convergence of the proposed algorithm are analyzed based on the variable-metric version of adaptive projected subgradient method. Numerical examples show the efficacy of the proposed algorithm for real data over a variety of methods including the extended Kalman filter and many batch machine-learning methods such as the multilayer perceptron.
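A key fact underlying the abstract — that any finite-dimensional space of functions has a reproducing kernel characterized by the Gram matrix — can be checked numerically. The sketch below is not the paper's algorithm; it only illustrates, for an assumed polynomial dictionary and a standard-normal input distribution (both choices are this sketch's assumptions, not the paper's), that k(x, y) = φ(x)ᵀ G⁻¹ φ(y) reproduces every function of the dictionary subspace under the empirical L² inner product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from the (assumed) input distribution mu; the L^2(mu) inner
# product <f, g> is approximated by the empirical mean of f*g over them.
x = rng.standard_normal(5000)

def phi(t):
    """Dictionary functions {1, t, t^2} spanning a 3-dim subspace of L^2(mu)."""
    t = np.atleast_1d(t)
    return np.stack([np.ones_like(t), t, t**2])   # shape (3, len(t))

Phi = phi(x)                      # dictionary evaluated at the samples
G = Phi @ Phi.T / x.size          # empirical Gram matrix G_ij = <phi_i, phi_j>
G_inv = np.linalg.inv(G)

def k(s, y):
    """Reproducing kernel of the dictionary subspace: phi(s)^T G^{-1} phi(y)."""
    return phi(s).T @ G_inv @ phi(y)

# Reproducing property: for any f = c^T phi in the subspace, <f, k(., y)> = f(y).
c = np.array([0.5, -1.0, 2.0])
y = 0.7
f_vals = c @ Phi                              # f evaluated at the samples
lhs = np.mean(f_vals * k(x, y).ravel())       # empirical <f, k(., y)>
rhs = float(c @ phi(y))                       # f(y)
print(abs(lhs - rhs) < 1e-10)                 # → True
```

Because the same samples define both the Gram matrix and the inner product, the reproducing property holds exactly (up to floating point) in the empirical L² space, which is why the check passes with a tight tolerance.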

Original language: English
Journal: IEEE Transactions on Signal Processing
DOI: 10.1109/TSP.2018.2846271
Publication status: Accepted/In press - 2018 Jun 9

Keywords

  • kernel adaptive filter
  • L2 space
  • metric projection
  • online learning
  • recursive least squares

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

@article{8a1f781ba4ed467fafe5fa21d4fef736,
title = "Online Nonlinear Estimation via Iterative L$^{2}$-Space Projections: Reproducing Kernel of Subspace",
abstract = "We propose a novel online learning paradigm for nonlinear-function estimation tasks based on the iterative projections in the L$^{2}$ space with probability measure reflecting the stochastic property of input signals. The proposed learning algorithm exploits the reproducing kernel of the so-called dictionary subspace, based on the fact that any finite-dimensional space of functions has a reproducing kernel characterized by the Gram matrix. The L$^{2}$-space geometry provides the best decorrelation property in principle. The proposed learning paradigm is significantly different from the conventional kernel-based learning paradigm in two senses: (i) the whole space is not a reproducing kernel Hilbert space and (ii) the minimum mean squared error estimator gives the best approximation of the desired nonlinear function in the dictionary subspace. It preserves efficiency in computing the inner product as well as in updating the Gram matrix when the dictionary grows. Monotone approximation, asymptotic optimality, and convergence of the proposed algorithm are analyzed based on the variable-metric version of adaptive projected subgradient method. Numerical examples show the efficacy of the proposed algorithm for real data over a variety of methods including the extended Kalman filter and many batch machine-learning methods such as the multilayer perceptron.",
keywords = "kernel adaptive filter, L2 space, metric projection, online learning, recursive least squares",
author = "Motoya Ohnishi and Masahiro Yukawa",
year = "2018",
month = "6",
day = "9",
doi = "10.1109/TSP.2018.2846271",
language = "English",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - JOUR

T1 - Online Nonlinear Estimation via Iterative L$^{2}$-Space Projections

T2 - Reproducing Kernel of Subspace

AU - Ohnishi, Motoya

AU - Yukawa, Masahiro

PY - 2018/6/9

Y1 - 2018/6/9

N2 - We propose a novel online learning paradigm for nonlinear-function estimation tasks based on the iterative projections in the L$^{2}$ space with probability measure reflecting the stochastic property of input signals. The proposed learning algorithm exploits the reproducing kernel of the so-called dictionary subspace, based on the fact that any finite-dimensional space of functions has a reproducing kernel characterized by the Gram matrix. The L$^{2}$-space geometry provides the best decorrelation property in principle. The proposed learning paradigm is significantly different from the conventional kernel-based learning paradigm in two senses: (i) the whole space is not a reproducing kernel Hilbert space and (ii) the minimum mean squared error estimator gives the best approximation of the desired nonlinear function in the dictionary subspace. It preserves efficiency in computing the inner product as well as in updating the Gram matrix when the dictionary grows. Monotone approximation, asymptotic optimality, and convergence of the proposed algorithm are analyzed based on the variable-metric version of adaptive projected subgradient method. Numerical examples show the efficacy of the proposed algorithm for real data over a variety of methods including the extended Kalman filter and many batch machine-learning methods such as the multilayer perceptron.

AB - We propose a novel online learning paradigm for nonlinear-function estimation tasks based on the iterative projections in the L$^{2}$ space with probability measure reflecting the stochastic property of input signals. The proposed learning algorithm exploits the reproducing kernel of the so-called dictionary subspace, based on the fact that any finite-dimensional space of functions has a reproducing kernel characterized by the Gram matrix. The L$^{2}$-space geometry provides the best decorrelation property in principle. The proposed learning paradigm is significantly different from the conventional kernel-based learning paradigm in two senses: (i) the whole space is not a reproducing kernel Hilbert space and (ii) the minimum mean squared error estimator gives the best approximation of the desired nonlinear function in the dictionary subspace. It preserves efficiency in computing the inner product as well as in updating the Gram matrix when the dictionary grows. Monotone approximation, asymptotic optimality, and convergence of the proposed algorithm are analyzed based on the variable-metric version of adaptive projected subgradient method. Numerical examples show the efficacy of the proposed algorithm for real data over a variety of methods including the extended Kalman filter and many batch machine-learning methods such as the multilayer perceptron.

KW - kernel adaptive filter

KW - L2 space

KW - metric projection

KW - online learning

KW - recursive least squares

UR - http://www.scopus.com/inward/record.url?scp=85048487538&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85048487538&partnerID=8YFLogxK

U2 - 10.1109/TSP.2018.2846271

DO - 10.1109/TSP.2018.2846271

M3 - Article

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

ER -