Reproducing kernel Hilbert C*-module and kernel mean embeddings

Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi Katsura, Yoshinobu Kawahara

Research output: Contribution to journal › Article › peer-review

Abstract

Kernel methods have been among the most popular techniques in machine learning, where learning tasks are solved using the properties of reproducing kernel Hilbert spaces (RKHS). In this paper, we propose a novel data analysis framework with reproducing kernel Hilbert C*-module (RKHM) and kernel mean embedding (KME) in RKHM. Since RKHM contains richer information than RKHS or vector-valued RKHS (vvRKHS), analysis with RKHM enables us to capture and extract structural properties of data such as functional data. We develop a branch of theory for RKHM to apply to data analysis, including the representer theorem and the injectivity and universality of the proposed KME. We also show that RKHM generalizes RKHS and vvRKHS. Then, we provide concrete procedures for employing RKHM and the proposed KME in data analysis.
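As background for the abstract's terminology, the classical kernel mean embedding in a scalar-valued RKHS, which the proposed RKHM framework generalizes, can be sketched as follows. This is an illustrative sketch of the standard RKHS construction, not the paper's RKHM method; the function names `gaussian_kernel` and `mmd2` are assumptions for this example.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Pairwise Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Squared RKHS distance between the empirical kernel mean embeddings
    of two samples (the maximum mean discrepancy, MMD^2)."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    # <mu_X, mu_X> + <mu_Y, mu_Y> - 2 <mu_X, mu_Y> in the RKHS
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

With a characteristic kernel such as the Gaussian, the embedding is injective, so the MMD is small for samples drawn from the same distribution and larger for samples from different distributions; the paper's KME in RKHM extends this idea to operator-valued inner products.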

Original language: English
Journal: Journal of Machine Learning Research
Volume: 22
Publication status: Published - 2021

Keywords

  • Interaction effects
  • Kernel PCA
  • Kernel mean embedding
  • Reproducing kernel Hilbert C-module
  • Structured data

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

