Learning in multilayered networks used as autoassociators
- 1 March 1995
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 6 (2), 512-515
- https://doi.org/10.1109/72.363492
Abstract
Gradient descent learning algorithms may get stuck in local minima, thus making learning suboptimal. In this paper, we focus on multilayered networks used as autoassociators and show some relationships with classical linear autoassociators. In addition, using the theoretical framework of our previous research, we derive a condition which is met at the end of the learning process and show that this condition has a very intriguing geometrical meaning in the pattern space.
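To make the setting concrete, here is a minimal NumPy sketch of the kind of system the abstract discusses: a two-layer linear autoassociator trained by plain gradient descent on the reconstruction error. This is an illustrative toy (dimensions, learning rate, and data are arbitrary assumptions), not the paper's own derivation or experiments.

```python
import numpy as np

# Toy linear autoassociator: reconstruct x from a low-dimensional code,
#   x_hat = W2 @ (W1 @ x),  minimizing mean ||x_hat - x||^2 by gradient descent.
# All sizes and hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))        # 50 patterns in R^8
W1 = 0.1 * rng.standard_normal((3, 8))  # encoder: 8 -> 3 (bottleneck)
W2 = 0.1 * rng.standard_normal((8, 3))  # decoder: 3 -> 8

mse_before = np.mean((X @ W1.T @ W2.T - X) ** 2)

lr = 0.01
for _ in range(2000):
    H = X @ W1.T                     # hidden codes, shape (50, 3)
    E = H @ W2.T - X                 # reconstruction error, shape (50, 8)
    gW2 = E.T @ H / len(X)           # gradient w.r.t. decoder weights
    gW1 = W2.T @ E.T @ X / len(X)    # gradient w.r.t. encoder weights
    W2 -= lr * gW2
    W1 -= lr * gW1

mse_after = np.mean((X @ W1.T @ W2.T - X) ** 2)
print(mse_before, mse_after)
```

In the linear case, the classical results cited below (Baldi and Hornik, 1989; Bourlard and Kamp, 1988) show that the reconstruction error landscape has no suboptimal local minima and that the optimum spans the principal subspace of the data, which is the backdrop for this paper's analysis of the nonlinear multilayered case.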
References
- Can backpropagation error surface not have local minima. IEEE Transactions on Neural Networks, 1992
- On the problem of local minima in backpropagation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992
- Neural networks and principal component analysis: Learning from examples without local minima. Neural Networks, 1989
- Auto-association by multilayer perceptrons and singular value decomposition. Biological Cybernetics, 1988