A simple method to derive bounds on the size and to train multilayer neural networks
- 1 July 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (4), 467-471
- https://doi.org/10.1109/72.88168
Abstract
A new derivation is presented for the bounds on the size of a multilayer neural network to exactly implement an arbitrary training set; namely, the training set can be implemented with zero error with two layers and with the number of hidden-layer neurons equal to n1 >= p-1. The derivation does not require the separation of the input space by particular hyperplanes, as in previous derivations. The weights for the hidden layer can be chosen almost arbitrarily, and the weights for the output layer can be found by solving n1+1 linear equations. The method presented exactly solves (M), the multilayer neural network training problem, for any arbitrary training set.
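The construction described in the abstract can be sketched in a few lines of NumPy: fix an almost arbitrary (here random) hidden layer of n1 = p-1 neurons, then determine the n1+1 output-layer weights (including a bias) by solving a square linear system so the p training targets are matched exactly. This is a minimal illustrative sketch, not the paper's exact procedure; the toy XOR data, the tanh activation, and all variable names are assumptions, and p is taken to denote the number of training patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set (assumed for illustration): p = 4 patterns, XOR targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])

p, n_in = X.shape
n1 = p - 1                           # hidden-layer size from the bound n1 >= p-1

# Hidden-layer weights chosen almost arbitrarily (random here).
W = rng.standard_normal((n_in, n1))
b = rng.standard_normal(n1)
H = np.tanh(X @ W + b)               # p x n1 matrix of hidden activations

# Output layer: n1 weights plus one bias gives n1 + 1 = p unknowns,
# so the p interpolation conditions form a square linear system.
A = np.hstack([H, np.ones((p, 1))])  # p x (n1 + 1)
v = np.linalg.solve(A, t)            # output weights and bias

y = A @ v                            # network outputs on the training set
print(np.max(np.abs(y - t)))         # zero training error (up to round-off)
```

With a random hidden layer and distinct input patterns, the activation matrix A is nonsingular with probability one, which is what makes the single linear solve sufficient.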
References (2):
- Bounds on the number of hidden neurons in multilayer perceptrons. IEEE Transactions on Neural Networks, 1991
- On the capabilities of multilayer perceptrons. Journal of Complexity, 1988