Columnar recurrent neural network and time series analysis

Masahiro Matsuoka, Mostefa Golea, Yasubumi Sakakibara

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing the delayed outputs of that hidden unit. The model provides predictions in the form of a probability density function over the context units, instead of the single guess usually produced by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.
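The abstract describes a "columnar" recurrence in which each hidden unit receives feedback only from its own column of context units holding its delayed outputs, unlike an Elman network where every hidden unit sees the entire context layer. The paper's actual equations are not reproduced in this record, so the following is only a minimal sketch of that idea; all class names, dimensions, and initializations are assumptions for illustration.

```python
import numpy as np

class ColumnarRecurrentLayer:
    """Sketch of a columnar recurrent layer (names/shapes are assumptions):
    hidden unit j keeps a column of `depth` context units holding its own
    delayed outputs h_j(t-1), ..., h_j(t-depth), and its recurrent input
    comes only from that column."""

    def __init__(self, n_in, n_hidden, depth, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))    # input weights
        self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, depth))  # per-unit column weights
        self.context = np.zeros((n_hidden, depth))            # delayed outputs per unit

    def step(self, x):
        # Each hidden unit combines the input with ONLY its own delay column
        # (elementwise product with its column, summed over delays).
        net = self.W_in @ x + np.sum(self.W_ctx * self.context, axis=1)
        h = 1.0 / (1.0 + np.exp(-net))  # sigmoid activation
        # Shift the column: newest output into slot 0, older ones move back.
        self.context = np.roll(self.context, 1, axis=1)
        self.context[:, 0] = h
        return h

layer = ColumnarRecurrentLayer(n_in=3, n_hidden=4, depth=2)
for t in range(5):
    h = layer.step(np.ones(3))
print(h.shape)  # (4,)
```

In an Elman network the recurrent term would instead be a full matrix product with the whole context vector; restricting each unit to its own delay column is what makes the context units directly interpretable as the delayed history of individual hidden units.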

Original language: English
Pages (from-to): 183-191
Number of pages: 9
Journal: Fujitsu Scientific and Technical Journal
Volume: 32
Issue number: 2
Publication status: Published - 1996
Externally published: Yes

Fingerprint

Time series analysis
Recurrent neural networks
Probability distributions
Probability density function
Computer simulation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

Columnar recurrent neural network and time series analysis. / Matsuoka, Masahiro; Golea, Mostefa; Sakakibara, Yasubumi.

In: Fujitsu Scientific and Technical Journal, Vol. 32, No. 2, 1996, p. 183-191.

Research output: Contribution to journal › Article

Matsuoka, Masahiro; Golea, Mostefa; Sakakibara, Yasubumi. / Columnar recurrent neural network and time series analysis. In: Fujitsu Scientific and Technical Journal. 1996; Vol. 32, No. 2. pp. 183-191.
@article{b411581a1b2d4f46bcf28ffb60485f82,
title = "Columnar recurrent neural network and time series analysis",
abstract = "Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing the delayed outputs of that hidden unit. The model provides predictions in the form of a probability density function over the context units, instead of the single guess usually produced by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.",
author = "Masahiro Matsuoka and Mostefa Golea and Yasubumi Sakakibara",
year = "1996",
language = "English",
volume = "32",
pages = "183--191",
journal = "Fujitsu Scientific and Technical Journal",
issn = "0016-2523",
publisher = "Fujitsu Ltd",
number = "2",
}

TY - JOUR

T1 - Columnar recurrent neural network and time series analysis

AU - Matsuoka, Masahiro

AU - Golea, Mostefa

AU - Sakakibara, Yasubumi

PY - 1996

Y1 - 1996

N2 - Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing the delayed outputs of that hidden unit. The model provides predictions in the form of a probability density function over the context units, instead of the single guess usually produced by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.

AB - Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing the delayed outputs of that hidden unit. The model provides predictions in the form of a probability density function over the context units, instead of the single guess usually produced by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.

UR - http://www.scopus.com/inward/record.url?scp=0030398085&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030398085&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0030398085

VL - 32

SP - 183

EP - 191

JO - Fujitsu Scientific and Technical Journal

JF - Fujitsu Scientific and Technical Journal

SN - 0016-2523

IS - 2

ER -