Exploring latent structure of mixture ICA models by the minimum β-divergence method

Md Nurul Haque Mollah, Mihoko Minami, Shinto Eguchi

Research output: Contribution to journal › Article

24 Citations (Scopus)

Abstract

Independent component analysis (ICA) attempts to extract original independent signals (source components) that are linearly mixed in a basic framework. This letter discusses a learning algorithm for the separation of different source classes in which the observed data follow a mixture of several ICA models, each described by a linear combination of independent and nongaussian sources. The proposed method applies the minimum β-divergence method sequentially to separate the source classes one by one. It searches for the recovering matrix of each class on the basis of a rule for sequentially changing the shifting parameter. If the initial choice of the shifting parameter vector is close to the mean of a data class, then all of the hidden sources belonging to that class are recovered properly with independent and nongaussian structure, treating the data in other classes as outliers. The value of the tuning parameter β is key to the performance of the proposed method. A cross-validation technique is proposed as an adaptive selection procedure for the tuning parameter β, together with applications to both real and synthetic data analysis.
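The core robustness idea in the abstract — data from other classes act as outliers and are downweighted — can be illustrated with a minimal sketch. This is not the authors' full algorithm (which estimates a recovering matrix and shifting parameter for each ICA model class); it only shows how a β-divergence-style exponential weight `exp(-β·d²/2)` suppresses points far from the current estimate, so that a location estimate locks onto one cluster. The function name and the choice of an isotropic Gaussian weight are illustrative assumptions.

```python
import numpy as np

def beta_weighted_mean(X, beta=0.2, n_iter=50):
    """Robust location estimate via iterative beta-divergence-style
    reweighting: a point at squared distance d2 from the current centre
    gets weight exp(-beta * d2 / 2), so a distant cluster (another
    "class" of data) is effectively treated as outliers."""
    mu = X.mean(axis=0)  # start from the ordinary (non-robust) mean
    for _ in range(n_iter):
        d2 = ((X - mu) ** 2).sum(axis=1)      # squared distances to centre
        w = np.exp(-beta * d2 / 2.0)          # downweight far-away points
        mu = (w[:, None] * X).sum(axis=0) / w.sum()  # weighted mean update
    return mu

rng = np.random.default_rng(0)
inliers = rng.normal(0.0, 1.0, size=(200, 2))   # class of interest, centred at 0
outliers = rng.normal(8.0, 1.0, size=(50, 2))   # a second class, far away
X = np.vstack([inliers, outliers])
print(beta_weighted_mean(X, beta=0.5))  # close to (0, 0), not the pooled mean
```

As β → 0 the weights become uniform and the estimate reverts to the ordinary mean; larger β trades efficiency for robustness, which is why the paper proposes cross-validation to choose β adaptively.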

Original language: English
Pages (from-to): 166-190
Number of pages: 25
Journal: Neural Computation
Volume: 18
Issue number: 1
DOIs: 10.1162/089976606774841549
Publication status: Published - Jan 2006
Externally published: Yes


ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Neuroscience(all)

Cite this

Exploring latent structure of mixture ICA models by the minimum β-divergence method. / Mollah, Md Nurul Haque; Minami, Mihoko; Eguchi, Shinto.

In: Neural Computation, Vol. 18, No. 1, 01.2006, p. 166-190.

@article{86118dafab2a488d9b2a08dd65cab335,
  title = "Exploring latent structure of mixture ICA models by the minimum β-divergence method",
  author = "Mollah, {Md Nurul Haque} and Mihoko Minami and Shinto Eguchi",
  year = "2006",
  month = jan,
  doi = "10.1162/089976606774841549",
  language = "English",
  volume = "18",
  number = "1",
  pages = "166--190",
  journal = "Neural Computation",
  issn = "0899-7667",
  publisher = "MIT Press Journals",
}
