This paper proposes a new backpropagation-type learning algorithm that incorporates a selection capability for hidden-layer units. The algorithm is simple and effective in reducing both the number of training cycles required and the number of hidden-layer units. The method consists of selecting the "worst" units in the hidden layer and eliminating them. Before convergence is reached, resetting the connection weights of the selected "bad" units to small random values provides an escape from local minima and shortens learning time. As the network approaches convergence, preferentially deleting units starting with the "worst" reduces the size of the hidden layer. This reduction increases the generalization capability of the network and lowers computational cost. Computer simulations demonstrate the superior performance of the proposed algorithm.
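The two-phase mechanism described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: the "worst"-unit criterion here (smallest L2 norm of a unit's outgoing weights) and all function names are assumptions introduced for illustration; the paper's exact selection measure may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def worst_units(W_out, k):
    # Rank hidden units by the L2 norm of their outgoing weights;
    # the k units with the smallest norms contribute least and are
    # treated as "worst". (Assumed criterion, for illustration only.)
    norms = np.linalg.norm(W_out, axis=1)
    return np.argsort(norms)[:k]

def reset_units(W_in, W_out, idx, scale=0.01):
    # Before convergence: re-randomize the selected units' incoming
    # and outgoing weights to help the network escape a local minimum.
    W_in[:, idx] = rng.normal(0.0, scale, size=(W_in.shape[0], len(idx)))
    W_out[idx, :] = rng.normal(0.0, scale, size=(len(idx), W_out.shape[1]))
    return W_in, W_out

def prune_units(W_in, W_out, idx):
    # Near convergence: delete the selected units outright, shrinking
    # the hidden layer.
    keep = np.setdiff1d(np.arange(W_in.shape[1]), idx)
    return W_in[:, keep], W_out[keep, :]

# Example: a 4-input, 6-hidden, 2-output network; prune the 2 worst units.
W_in = rng.normal(size=(4, 6))
W_out = rng.normal(size=(6, 2))
bad = worst_units(W_out, 2)
W_in, W_out = prune_units(W_in, W_out, bad)
print(W_in.shape, W_out.shape)  # hidden layer shrinks from 6 to 4 units
```

In the early phase the same `worst_units` selection would feed `reset_units` instead of `prune_units`, so one ranking criterion serves both the escape-from-local-minimum step and the later pruning step.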