### Abstract

Recurrent neural networks have the potential to develop internal representations that allow useful encoding of the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units that represent delayed outputs of that hidden unit. The probability distribution model provides predictions in terms of a given probability density function of the context units, rather than the single guess usually provided by Elman-type recurrent neural networks. The advantage of this approach is the interpretable correspondence between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.
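The columnar architecture described above can be illustrated with a minimal sketch. This is not the paper's code; all names, sizes, and the logistic activation are assumptions made for illustration. The key point it shows is the columnar restriction: each hidden unit's recurrent term uses only its own delayed outputs, whereas an Elman network would feed back the full hidden state to every unit.

```python
import numpy as np

def columnar_rnn_step(x, contexts, W_in, w_ctx, b):
    """One time step of a hypothetical columnar recurrent layer.
    x        : (n_in,)            shared input vector
    contexts : (n_hidden, delay)  row i holds unit i's delayed outputs
    W_in     : (n_hidden, n_in)   input weights
    w_ctx    : (n_hidden, delay)  per-unit recurrent weights
    b        : (n_hidden,)        biases
    """
    # Columnar restriction: unit i's recurrent term mixes only row i of
    # `contexts` (its own past outputs), not the whole hidden state.
    pre = W_in @ x + np.sum(w_ctx * contexts, axis=1) + b
    h = 1.0 / (1.0 + np.exp(-pre))  # logistic hidden activations
    # Shift each unit's context window and append its new output.
    new_contexts = np.concatenate([contexts[:, 1:], h[:, None]], axis=1)
    return h, new_contexts

rng = np.random.default_rng(0)
n_in, n_hidden, delay = 3, 4, 2
W_in = rng.normal(size=(n_hidden, n_in))
w_ctx = rng.normal(size=(n_hidden, delay))
b = np.zeros(n_hidden)
contexts = np.zeros((n_hidden, delay))
for t in range(5):  # run a short input sequence
    x = rng.normal(size=n_in)
    h, contexts = columnar_rnn_step(x, contexts, W_in, w_ctx, b)
print(h.shape, contexts.shape)  # (4,) (4, 2)
```

Because each column feeds back only its own history, a context row can be read as a short trajectory of one unit's output, which is what makes the interpretation of context units in terms of the input dynamics tractable.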

Original language | English
---|---
Pages (from-to) | 183-191
Number of pages | 9
Journal | Fujitsu Scientific and Technical Journal
Volume | 32
Issue number | 2
Publication status | Published - 1996
Externally published | Yes

### ASJC Scopus subject areas

- Electrical and Electronic Engineering

### Cite this

Matsuoka, Masahiro; Golea, Mostefa; Sakakibara, Yasubumi. **Columnar recurrent neural network and time series analysis.** *Fujitsu Scientific and Technical Journal*, vol. 32, no. 2, 1996, pp. 183-191.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Columnar recurrent neural network and time series analysis

AU - Matsuoka, Masahiro

AU - Golea, Mostefa

AU - Sakakibara, Yasubumi

PY - 1996

Y1 - 1996

N2 - Recurrent neural networks have the potential to develop internal representations that allow useful encoding of the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units that represent delayed outputs of that hidden unit. The probability distribution model provides predictions in terms of a given probability density function of the context units, rather than the single guess usually provided by Elman-type recurrent neural networks. The advantage of this approach is the interpretable correspondence between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.

AB - Recurrent neural networks have the potential to develop internal representations that allow useful encoding of the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units that represent delayed outputs of that hidden unit. The probability distribution model provides predictions in terms of a given probability density function of the context units, rather than the single guess usually provided by Elman-type recurrent neural networks. The advantage of this approach is the interpretable correspondence between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.

UR - http://www.scopus.com/inward/record.url?scp=0030398085&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030398085&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0030398085

VL - 32

SP - 183

EP - 191

JO - Fujitsu Scientific and Technical Journal

JF - Fujitsu Scientific and Technical Journal

SN - 0016-2523

IS - 2

ER -