Backpropagation with selection-reduction of learning time and elimination of hidden units

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

This paper proposes a new backpropagation-type learning algorithm that gives the hidden-layer units a selection capability. The algorithm is simple yet effective in reducing both the number of training cycles required and the number of hidden-layer units. The method selects the "worst" units in the hidden layer and acts on them. Before convergence is achieved, resetting the connection weights of the selected "bad" units to small random values allows the network to escape local minima and shortens learning time. As the network converges, units are preferentially deleted starting with the "worst," reducing the number of hidden-layer units. This reduction increases the generalization capability of the network and lowers computation and related costs. Computer simulations demonstrate the superior performance of the proposed algorithm.
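The abstract does not spell out the selection criterion or the schedule on which units are reset or deleted, so the following is only a minimal sketch of the idea, under stated assumptions: a one-hidden-layer backpropagation network (NumPy) in which, every few hundred epochs, the hidden unit with the smallest outgoing-weight magnitude is taken as the "worst"; while the error is still high its weights are reset to small random values (escape from a local minimum), and once the error is low it is deleted instead (reduction of the hidden layer). The badness measure, the 0.05 error threshold, the 200-epoch interval, and all other parameters are illustrative assumptions, not the paper's values.

    # Sketch of the selection-reduction idea (assumed details, not the paper's exact method).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class SelectionReductionNet:
        def __init__(self, n_in, n_hidden, n_out, lr=0.5):
            self.W1 = rng.uniform(-0.5, 0.5, (n_hidden, n_in))   # input -> hidden weights
            self.W2 = rng.uniform(-0.5, 0.5, (n_out, n_hidden))  # hidden -> output weights
            self.lr = lr

        def forward(self, x):
            self.h = sigmoid(self.W1 @ x)
            self.y = sigmoid(self.W2 @ self.h)
            return self.y

        def backprop(self, x, t):
            # Standard backpropagation step for a sigmoid network with squared error.
            y = self.forward(x)
            delta_out = (y - t) * y * (1 - y)
            delta_hid = (self.W2.T @ delta_out) * self.h * (1 - self.h)
            self.W2 -= self.lr * np.outer(delta_out, self.h)
            self.W1 -= self.lr * np.outer(delta_hid, x)
            return 0.5 * np.sum((y - t) ** 2)

        def worst_unit(self):
            # Assumed "worst-unit" criterion: the hidden unit whose outgoing weights
            # are smallest in magnitude contributes least to the output.
            return int(np.argmin(np.sum(np.abs(self.W2), axis=0)))

        def reset_unit(self, j):
            # Before convergence: reset the selected unit's weights to small random
            # values to help the network escape a local minimum.
            self.W1[j, :] = rng.uniform(-0.1, 0.1, self.W1.shape[1])
            self.W2[:, j] = rng.uniform(-0.1, 0.1, self.W2.shape[0])

        def delete_unit(self, j):
            # Near convergence: eliminate the selected unit entirely.
            self.W1 = np.delete(self.W1, j, axis=0)
            self.W2 = np.delete(self.W2, j, axis=1)

    # Usage example: learning XOR with a deliberately oversized hidden layer.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    net = SelectionReductionNet(n_in=2, n_hidden=8, n_out=1)
    for epoch in range(5000):
        err = sum(net.backprop(x, t) for x, t in zip(X, T))
        if epoch % 200 == 199:              # periodic selection step (assumed schedule)
            j = net.worst_unit()
            if err > 0.05:                  # not yet converging: reset the worst unit
                net.reset_unit(j)
            elif net.W1.shape[0] > 2:       # converging: prune it instead
                net.delete_unit(j)
    print("final error:", err, "hidden units left:", net.W1.shape[0])

Resetting rather than deleting before convergence keeps the network's capacity intact while perturbing it out of a poor region of weight space; deleting only once the error is low is what yields the smaller hidden layer and the improved generalization described above.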

Original language: English
Pages (from-to): 46-54
Number of pages: 9
Journal: Systems and Computers in Japan
Volume: 23
Issue number: 8
Publication status: Published - 1992


ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Hardware and Architecture
  • Information Systems
  • Theoretical Computer Science

Cite this

Backpropagation with selection-reduction of learning time and elimination of hidden units. / Hagiwara, Masafumi.

In: Systems and Computers in Japan, Vol. 23, No. 8, 1992, p. 46-54.

Research output: Contribution to journal › Article

@article{e930500a02ef49f1bf322a66284661b8,
title = "Backpropagation with selection-reduction of learning time and elimination of hidden units",
abstract = "This paper proposes a new backpropagation-type learning algorithm which incorporates selection capability in the hidden-layer units. This algorithm is simple and is effective in reducing both the number of training cycles required and the number of hidden-layer units. In the proposed algorithm, the method consists of selecting the ``worst'' units from among those in the hidden layer and eliminating them. Before convergence is achieved, by resetting the connection weights of the selected ``bad'' units to small random values, an escape from a local minimum is effected and learning time is shortened. When the network is converging, by preferentially deleting units starting with the ``worst,'' a reduction in the number of units on the hidden layer is achieved. The reduction of the number of units on the hidden layer increases the generalization capability of the network and contributes to a reduction in the computation costs and the like. Through a computer simulation, the superior performance of the proposed algorithm is demonstrated.",
author = "Masafumi Hagiwara",
year = "1992",
language = "English",
volume = "23",
pages = "46--54",
journal = "Systems and Computers in Japan",
issn = "0882-1666",
publisher = "John Wiley and Sons Inc.",
number = "8",

}
