Recurrent neural network using mixture of experts for time series processing

Mirai Tabuse, Makoto Kinouchi, Masafumi Hagiwara

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we propose a Mixture of Experts with recurrent connections for improved time series processing. The proposed network has recurrent connections from the output layer to the context layer, as in the Jordan network. The context layer is expanded into a number of sublayers so that the information necessary for time series processing can be held for a longer time. Most learning algorithms for conventional recurrent networks are based on the Back-Propagation (BP) algorithm, so the number of epochs required for convergence tends to be large. The Mixture of Experts used in the proposed network employs a modular approach: trained with the Expectation-Maximization (EM) algorithm, it converges very quickly, especially in the initial steps. The proposed network can also employ the EM algorithm, so faster convergence is expected. We examined the ability of the proposed network through computer simulations, which show that it converges in fewer epochs than the conventional networks.
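
To make the architecture concrete, the following is a minimal forward-pass sketch in Python/NumPy of a Jordan-style recurrent mixture of experts with expanded context sublayers. All names, layer sizes, and the use of linear experts are illustrative assumptions rather than the authors' implementation; in particular, the paper trains the mixture with the EM algorithm, which this sketch does not reproduce.

```python
# Sketch of a Jordan-style recurrent mixture of experts (assumed details;
# the paper's EM training procedure is not reproduced here).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 3, 1      # external input / network output sizes (assumed)
n_sublayers = 4         # context sublayers holding successively older outputs
n_experts = 3           # number of expert networks

d = n_in + n_sublayers * n_out  # gate/expert input: current input + context

W_gate = rng.normal(scale=0.1, size=(n_experts, d))        # gating network
W_exp = rng.normal(scale=0.1, size=(n_experts, n_out, d))  # linear experts


def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()


def step(x, context):
    """One time step of the recurrent mixture of experts.

    The gate and every expert see the current input together with the
    context sublayers; the new output is fed back into the first sublayer
    (Jordan-style) while older outputs shift down one sublayer.
    """
    v = np.concatenate([x, context.ravel()])
    g = softmax(W_gate @ v)                          # expert mixing weights
    outs = np.stack([W @ v for W in W_exp])          # (n_experts, n_out)
    y = g @ outs                                     # gated combination
    context = np.vstack([y[None, :], context[:-1]])  # shift the sublayers
    return y, context


# Usage: run the untrained network over a short random sequence.
context = np.zeros((n_sublayers, n_out))
for t in range(5):
    y, context = step(rng.normal(size=n_in), context)
    print(f"t={t}: y={y}")
```

Because each sublayer holds the output from one further step in the past, the expanded context layer retains information over more time steps than a single Jordan context layer would.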

Original language: English
Pages (from-to): 536-541
Number of pages: 6
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 1
Publication status: Published - 1997 Dec 1
Event: Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics. Part 1 (of 5) - Orlando, FL, USA
Duration: 1997 Oct 12 - 1997 Oct 15

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture
