Backpropagation with selection—reduction of learning time and elimination of hidden units

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

This paper proposes a new backpropagation-type learning algorithm that incorporates a selection capability for hidden-layer units. The algorithm is simple and effectively reduces both the number of training cycles required and the number of hidden-layer units. The method consists of selecting the "worst" units in the hidden layer and eliminating them. Before convergence is reached, resetting the connection weights of the selected "bad" units to small random values provides an escape from local minima and shortens learning time. As the network converges, preferentially deleting units starting with the "worst" reduces the number of units in the hidden layer. This reduction increases the generalization capability of the network and lowers computation costs. Computer simulations demonstrate the superior performance of the proposed algorithm.
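The two operations described in the abstract — re-randomizing a "worst" hidden unit's weights before convergence, and deleting it outright near convergence — can be sketched roughly as follows. The abstract does not specify how the "worst" unit is scored, so the outgoing-weight-magnitude criterion below is an assumption for illustration, not the paper's actual measure:

```python
import numpy as np

rng = np.random.default_rng(0)

def unit_scores(W_out):
    # Score each hidden unit by the total magnitude of its outgoing
    # weights (an assumed criterion; the paper's measure may differ).
    return np.abs(W_out).sum(axis=1)

def reset_worst_unit(W_in, W_out, scale=0.1):
    # Pre-convergence step: re-randomize the worst unit's incoming and
    # outgoing weights to small values, to help escape a local minimum.
    worst = int(np.argmin(unit_scores(W_out)))
    W_in[:, worst] = rng.uniform(-scale, scale, size=W_in.shape[0])
    W_out[worst, :] = rng.uniform(-scale, scale, size=W_out.shape[1])
    return worst

def prune_worst_unit(W_in, W_out):
    # Near convergence: delete the worst hidden unit outright,
    # shrinking the hidden layer by one.
    worst = int(np.argmin(unit_scores(W_out)))
    return np.delete(W_in, worst, axis=1), np.delete(W_out, worst, axis=0)

# Toy shapes: 4 inputs, 6 hidden units, 2 outputs.
W_in = rng.normal(size=(4, 6))
W_out = rng.normal(size=(6, 2))
W_in2, W_out2 = prune_worst_unit(W_in, W_out)
print(W_in2.shape, W_out2.shape)  # one hidden unit removed
```

In this sketch, `reset_worst_unit` would be called periodically during early training and `prune_worst_unit` once the error has nearly converged; the actual schedule and selection rule are those of the paper, not shown here.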

Original language: English
Pages (from-to): 46-54
Number of pages: 9
Journal: Systems and Computers in Japan
Volume: 23
Issue number: 8
DOIs
Publication status: Published - 1992

Keywords

  • Neural network
  • acceleration of convergence
  • backpropagation
  • elimination of hidden units

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Hardware and Architecture
  • Computational Theory and Mathematics
