Columnar recurrent neural network and time series analysis

Masahiro Matsuoka, Mostefa Golea, Yasubumi Sakakibara

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions based on a two-layered columnar recurrent neural network, in which each hidden unit receives recurrent connections from context units that hold the delayed outputs of that same hidden unit. The model can provide predictions in the form of a probability density function over the context units, rather than the single point estimate usually produced by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations on a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.
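The columnar architecture described above can be illustrated with a minimal sketch. This is not the authors' implementation: the class names, sizes, and weight initialization are hypothetical, and the sketch only shows the forward pass. The key assumptions, taken from the abstract, are that each hidden unit ("column") is fed by the shared input plus only its own delayed outputs (its context units), with no recurrent connections between different columns, and that the output layer produces a probability distribution rather than a single guess.

```python
import numpy as np


def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()


class ColumnarRecurrentCell:
    """Hypothetical sketch of one columnar hidden layer: each hidden
    unit sees the shared input plus ONLY its own `delay` past outputs
    (its context units); columns do not connect to each other."""

    def __init__(self, n_in, n_hidden, delay, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))    # input weights
        self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, delay))  # per-column context weights
        self.context = np.zeros((n_hidden, delay))            # delayed outputs, one row per column

    def step(self, x):
        # Each column combines the input with its own delay line only.
        h = np.tanh(self.W_in @ x + (self.W_ctx * self.context).sum(axis=1))
        # Shift each column's delay line and store the new output.
        self.context = np.roll(self.context, 1, axis=1)
        self.context[:, 0] = h
        return h


class ColumnarRNN:
    """Two-layered network: columnar recurrent hidden layer plus an
    output layer that yields a distribution over output symbols."""

    def __init__(self, n_in, n_hidden, n_out, delay, seed=0):
        rng = np.random.default_rng(seed)
        self.cell = ColumnarRecurrentCell(n_in, n_hidden, delay, seed)
        self.W_out = rng.normal(0.0, 0.1, (n_out, n_hidden))

    def step(self, x):
        h = self.cell.step(x)
        return softmax(self.W_out @ h)  # predicted probability distribution
```

For a stochastic-grammar task of the kind the abstract mentions, inputs would be one-hot symbol vectors fed one per time step, and each `step` call returns a distribution over the next symbol.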

Original language: English
Pages (from-to): 183-191
Number of pages: 9
Journal: Fujitsu Scientific and Technical Journal
Issue number: 2
Publication status: Published - 1996 Dec 1
Externally published: Yes

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Hardware and Architecture
  • Electrical and Electronic Engineering

