Simple recurrent networks as generalized hidden Markov models with distributed representations

Yasubumi Sakakibara, Mostefa Golea

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

We propose simple recurrent neural networks as probabilistic models for representing and predicting time-sequences. The proposed model has the advantage of providing forecasts that consist of probability densities instead of single guesses of future values. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed representation. We devise an efficient learning algorithm for estimating the parameters of the model using dynamic programming. We present some very preliminary simulation results to demonstrate the potential capabilities of the model. The present analysis provides a new probabilistic formulation of learning in simple recurrent networks.
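The model described in the abstract can be illustrated with a minimal sketch: an Elman-style simple recurrent network whose softmax output is read as a predictive probability density over the next symbol, so that chaining these conditional densities gives a sequence likelihood, analogous to the forward computation in a hidden Markov model. This is only an illustration of the general idea, not the authors' formulation — their dynamic-programming learning algorithm is not reproduced, and all network sizes and the random parameters below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sym, n_hid = 3, 5                          # alphabet size, hidden units (illustrative)
W_in = rng.normal(0, 0.5, (n_hid, n_sym))    # input -> hidden
W_rec = rng.normal(0, 0.5, (n_hid, n_hid))   # context -> hidden (recurrent weights)
W_out = rng.normal(0, 0.5, (n_sym, n_hid))   # hidden -> output

def softmax(z):
    """Normalized exponential: turns output activations into a density."""
    e = np.exp(z - z.max())
    return e / e.sum()

def sequence_log_likelihood(seq):
    """log P(seq) obtained by chaining the network's one-step predictive densities."""
    h = np.zeros(n_hid)                      # distributed hidden state (cf. an HMM's discrete state)
    logp = 0.0
    for t in range(len(seq) - 1):
        x = np.eye(n_sym)[seq[t]]            # one-hot encoding of the current symbol
        h = np.tanh(W_in @ x + W_rec @ h)    # update distributed state
        p_next = softmax(W_out @ h)          # predictive density over the next symbol
        logp += np.log(p_next[seq[t + 1]])
    return logp

ll = sequence_log_likelihood([0, 1, 2, 1, 0])
```

Because the output at each step is a full density rather than a point prediction, the same forward pass yields both forecasts and a likelihood that a parameter-estimation procedure could maximize.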

Original language: English
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Publisher: IEEE
Pages: 979-984
Number of pages: 6
Volume: 2
Publication status: Published - 1995
Externally published: Yes
Event: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6) - Perth, Australia
Duration: 1995 Nov 27 - 1995 Dec 1

Other

Other: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6)
City: Perth, Australia
Period: 95/11/27 - 95/12/1

Fingerprint

  • Hidden Markov models
  • Recurrent neural networks
  • Dynamic programming
  • Learning algorithms

ASJC Scopus subject areas

  • Software

Cite this

Sakakibara, Y., & Golea, M. (1995). Simple recurrent networks as generalized hidden Markov models with distributed representations. In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 2, pp. 979-984). IEEE.

@inproceedings{f68a51cfa1ad42e7acc58130347096cd,
title = "Simple recurrent networks as generalized hidden Markov models with distributed representations",
abstract = "We propose simple recurrent neural networks as probabilistic models for representing and predicting time-sequences. The proposed model has the advantage of providing forecasts that consist of probability densities instead of single guesses of future values. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed representation. We devise an efficient learning algorithm for estimating the parameters of the model using dynamic programming. We present some very preliminary simulation results to demonstrate the potential capabilities of the model. The present analysis provides a new probabilistic formulation of learning in simple recurrent networks.",
author = "Yasubumi Sakakibara and Mostefa Golea",
year = "1995",
language = "English",
volume = "2",
pages = "979--984",
booktitle = "IEEE International Conference on Neural Networks - Conference Proceedings",
publisher = "IEEE",

}

TY - GEN
T1 - Simple recurrent networks as generalized hidden Markov models with distributed representations
AU - Sakakibara, Yasubumi
AU - Golea, Mostefa
PY - 1995
Y1 - 1995
N2 - We propose simple recurrent neural networks as probabilistic models for representing and predicting time-sequences. The proposed model has the advantage of providing forecasts that consist of probability densities instead of single guesses of future values. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed representation. We devise an efficient learning algorithm for estimating the parameters of the model using dynamic programming. We present some very preliminary simulation results to demonstrate the potential capabilities of the model. The present analysis provides a new probabilistic formulation of learning in simple recurrent networks.
AB - We propose simple recurrent neural networks as probabilistic models for representing and predicting time-sequences. The proposed model has the advantage of providing forecasts that consist of probability densities instead of single guesses of future values. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed representation. We devise an efficient learning algorithm for estimating the parameters of the model using dynamic programming. We present some very preliminary simulation results to demonstrate the potential capabilities of the model. The present analysis provides a new probabilistic formulation of learning in simple recurrent networks.
UR - http://www.scopus.com/inward/record.url?scp=0029488150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0029488150&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:0029488150
VL - 2
SP - 979
EP - 984
BT - IEEE International Conference on Neural Networks - Conference Proceedings
PB - IEEE
ER -