Recurrent neural network using mixture of experts for time series processing

Mirai Tabuse, Makoto Kinouchi, Masafumi Hagiwara

Research output: Conference article › peer-review

Abstract

In this paper, we propose a Mixture of Experts with recurrent connections for improved time series processing. The proposed network has recurrent connections from the output layer to the context layer, as in the Jordan network. The context layer is expanded into a number of sublayers so that the information necessary for time series processing can be retained for a longer time. Most learning algorithms for conventional recurrent networks are based on the Back-Propagation (BP) algorithm, so the number of epochs required for convergence tends to be large. The Mixture of Experts used in the proposed network takes a modular approach: trained with the Expectation-Maximization (EM) algorithm, it converges very quickly, especially in the initial epochs. Because the proposed network can also employ the EM algorithm, faster convergence is expected. We have examined the ability of the proposed network through computer simulations, which show that it requires fewer epochs to converge than conventional networks.
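The abstract only describes the architecture at a high level, so the following is a minimal NumPy sketch of the forward pass it suggests: a mixture of experts whose input is augmented with a Jordan-style context built from delayed copies of the network output, held in several sublayers. All layer sizes, the number of experts and sublayers, the linear experts, and the plain softmax gate are illustrative assumptions, not the authors' exact formulation; EM training is omitted.

```python
# Sketch of a Jordan-style recurrent mixture of experts (assumed details).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 3, 2        # external input / output dimensions (assumed)
n_experts = 4             # number of expert networks (assumed)
n_sublayers = 3           # context sublayers holding delayed outputs (assumed)

d_ctx = n_out * n_sublayers      # total context size
d_x = n_in + d_ctx               # experts and the gate see input + context

# Linear experts and a linear-softmax gating network (illustrative only).
W_exp = rng.normal(0, 0.1, size=(n_experts, n_out, d_x))
W_gate = rng.normal(0, 0.1, size=(n_experts, d_x))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def step(u, context):
    """One time step: combine the external input u with the context sublayers,
    mix the expert outputs with the gating coefficients, and shift the new
    output into the context (output-to-context feedback as in Jordan nets)."""
    x = np.concatenate([u, context.ravel()])
    expert_out = W_exp @ x                 # (n_experts, n_out)
    g = softmax(W_gate @ x)                # gating coefficients
    y = g @ expert_out                     # mixture output
    # Newest output enters sublayer 0; older outputs move one sublayer back.
    context = np.vstack([y, context[:-1]])
    return y, context

context = np.zeros((n_sublayers, n_out))
for t in range(5):
    u = rng.normal(size=n_in)              # dummy input sequence
    y, context = step(u, context)
    print(t, np.round(y, 3))
```

Because the delayed outputs occupy separate sublayers rather than a single decaying context unit, information from several past steps remains explicitly available to both the experts and the gate, which is the property the abstract attributes to the expanded context layer.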

Original language: English
Pages (from-to): 536-541
Number of pages: 6
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 1
Publication status: Published - Dec 1 1997
Event: Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics. Part 1 (of 5) - Orlando, FL, USA
Duration: Oct 12 1997 - Oct 15 1997

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture

