Learning times of neural networks: Exact solution for a perceptron algorithm
- 1 October 1988
- Research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 38 (7), 3824-3826
- https://doi.org/10.1103/physreva.38.3824
Abstract
The performance of the optimal-stability perceptron learning algorithm of Krauth and Mézard is studied for the learning of random unbiased patterns in neural networks. In the thermodynamic limit N → ∞ with α = p/N finite, a replica approach is used to find the exact distribution of the number of time steps required to stabilize a pattern. Remarkably, for each neuron a finite fraction of the patterns do not contribute explicitly but are stabilized by the other patterns.
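The algorithm the abstract refers to can be illustrated with a minimal sketch of the MinOver idea of Krauth and Mézard: at each step, the pattern with the smallest aligned local field is reinforced Hebbianly, and the per-pattern update counts are the "learning times" whose distribution the paper computes. All names, the stability threshold `c`, and the stopping rule below are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np

def minover(patterns, targets, c=0.1, max_steps=5000):
    """Illustrative MinOver-style learning (after Krauth & Mezard):
    repeatedly reinforce the least stable pattern until every pattern's
    normalized aligned field reaches the (assumed) threshold c."""
    p, N = patterns.shape
    w = np.zeros(N)
    counts = np.zeros(p, dtype=int)        # number of updates per pattern
    for _ in range(max_steps):
        fields = targets * (patterns @ w)  # aligned local fields
        norm = np.linalg.norm(w)
        if norm > 0 and fields.min() / norm >= c:
            break                          # all patterns are stable
        mu = int(np.argmin(fields))        # least stable pattern
        w += targets[mu] * patterns[mu] / N  # Hebbian reinforcement
        counts[mu] += 1
    return w, counts

rng = np.random.default_rng(0)
N, p = 100, 20                             # alpha = p/N = 0.2
patterns = rng.choice([-1.0, 1.0], size=(p, N))  # random unbiased patterns
targets = rng.choice([-1.0, 1.0], size=p)
w, counts = minover(patterns, targets)
```

In this picture, the histogram of `counts` over patterns corresponds to the distribution of learning times studied in the paper, and patterns with a zero count are those that never contribute an explicit update yet end up stabilized by the updates made for other patterns.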
This publication has 15 references indexed in Scilit:
- Content-addressability and learning in neural networks. Journal of Physics A: General Physics, 1988
- The space of interactions in neural network models. Journal of Physics A: General Physics, 1988
- Dynamical Learning Process for Recognition of Correlated Patterns in Symmetric Spin Glass Models. Europhysics Letters, 1987
- Learning algorithms with optimal stability in neural networks. Journal of Physics A: General Physics, 1987
- Exact solution of a layered neural network model. Physical Review Letters, 1987
- An Exactly Solvable Asymmetric Neural Network Model. Europhysics Letters, 1987
- Learning of correlated patterns in spin-glass networks by local learning rules. Physical Review Letters, 1987
- Associative recall of memory without errors. Physical Review A, 1987
- Learning and pattern recognition in spin glass models. Zeitschrift für Physik B Condensed Matter, 1985
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 1982