Abstract
Recurrent neural networks have the potential to develop internal representations that usefully encode the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions using a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing delayed outputs of that hidden unit. The probability distribution model can provide predictions in terms of a given probability density function of the context units, instead of the single guess usually provided by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.
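To make the architecture concrete, below is a minimal NumPy sketch of one plausible reading of the columnar recurrence described in the abstract: each hidden unit feeds back only its own delayed activations (its private column of context units), unlike an Elman network whose context layer feeds every hidden unit, and the output layer emits a probability distribution rather than a point prediction. The class name `ColumnarRNN`, the `tanh` nonlinearity, the number of delay taps, the weight initialization, and the softmax output are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class ColumnarRNN:
    """Sketch of a two-layered columnar recurrent network.

    Each hidden unit keeps a delay line of its own past activations
    (its "column" of context units) and has recurrent weights only
    from that private history. Hyperparameters and the softmax
    output are illustrative, not the paper's exact formulation.
    """

    def __init__(self, n_in, n_hidden, n_out, n_delays=3, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (n_hidden, n_in))       # input -> hidden
        self.W_ctx = rng.normal(0, 0.1, (n_hidden, n_delays))  # per-unit delayed self-feedback
        self.W_out = rng.normal(0, 0.1, (n_out, n_hidden))     # hidden -> output
        self.context = np.zeros((n_hidden, n_delays))          # one context column per hidden unit

    def step(self, x):
        # Each hidden unit sees the input plus ONLY its own delay line.
        pre = self.W_in @ x + np.sum(self.W_ctx * self.context, axis=1)
        h = np.tanh(pre)
        # Shift each unit's delay line and store its new activation.
        self.context = np.roll(self.context, 1, axis=1)
        self.context[:, 0] = h
        # Softmax output: a predicted distribution over the next
        # symbol rather than a single point estimate.
        z = self.W_out @ h
        e = np.exp(z - z.max())
        return e / e.sum()

# Usage: feed a one-hot symbol stream, read distributions over next symbols.
net = ColumnarRNN(n_in=4, n_hidden=8, n_out=4)
x = np.eye(4)[0]
p_next = net.step(x)  # probability distribution over the 4 symbols
```

Restricting recurrence to within-column delay taps is what would make the context units individually interpretable: each one holds a specific lagged value of a specific hidden unit, so its contribution to the predicted distribution can be traced directly.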
| Original language | English |
| --- | --- |
| Pages (from-to) | 183-191 |
| Number of pages | 9 |
| Journal | Fujitsu Scientific and Technical Journal |
| Volume | 32 |
| Issue number | 2 |
| Publication status | Published - 1996 Dec 1 |
| Externally published | Yes |
ASJC Scopus subject areas
- Human-Computer Interaction
- Hardware and Architecture
- Electrical and Electronic Engineering