A simple procedure for pruning back-propagation trained neural networks
- 1 June 1990
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 1 (2), 239-242
- https://doi.org/10.1109/72.80236
Abstract
The sensitivity of the global error (cost) function to the inclusion/exclusion of each synapse in the artificial neural network is estimated. Introduced are shadow arrays that keep track of the incremental changes to the synaptic weights during a single pass of back-propagation learning. The synapses are then ordered by decreasing sensitivity numbers so that the network can be efficiently pruned by discarding the last items of the sorted list. Unlike previous approaches, this simple procedure does not require a modification of the cost function, does not interfere with the learning process, and demands a negligible computational overhead.
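The abstract outlines the procedure but not the sensitivity formula itself. Below is a minimal NumPy sketch of the idea, assuming a flat weight array, a hypothetical `grad_fn` returning ∂E/∂w, and the sensitivity estimate commonly attributed to this paper, S = Σₙ[Δw(n)]² · w_f / (η(w_f − w_i)); treat it as an illustration under those assumptions, not the paper's exact implementation.

```python
import numpy as np

def train_with_shadow(w, grad_fn, eta, n_steps):
    """Gradient-descent training that also fills a 'shadow array'.

    w       : initial weight array
    grad_fn : hypothetical callable returning dE/dw for the current weights
    eta     : learning rate
    """
    w = w.copy()
    w_init = w.copy()
    shadow = np.zeros_like(w)           # accumulates sum of (delta w)^2
    for _ in range(n_steps):
        dw = -eta * grad_fn(w)          # ordinary back-propagation update
        shadow += dw ** 2               # one extra multiply-add per weight
        w += dw
    # Per-weight sensitivity estimate; guard against weights that never moved
    denom = eta * (w - w_init)
    denom[denom == 0] = np.finfo(float).eps
    sensitivity = shadow * w / denom
    return w, sensitivity

def prune(w, sensitivity, fraction):
    """Zero out the given fraction of weights with the smallest sensitivity."""
    k = int(fraction * w.size)
    idx = np.argsort(sensitivity.ravel())[:k]   # least sensitive first
    w_pruned = w.copy().ravel()
    w_pruned[idx] = 0.0
    return w_pruned.reshape(w.shape)
```

The shadow array costs one multiply-accumulate per weight per update and leaves the cost function and training loop untouched, which is the source of the "negligible computational overhead" claim in the abstract.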