First-order transition to perfect generalization in a neural network with binary synapses
- 1 June 1990
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 41 (12), 7097-7100
- https://doi.org/10.1103/physreva.41.7097
Abstract
Learning from examples by a perceptron with binary synaptic parameters is studied. The examples are given by a reference (teacher) perceptron. It is shown that as the number of examples increases, the network undergoes a first-order transition, where it freezes into the state of the reference perceptron. When the transition point is approached from below, the generalization error tends to a finite positive value, while above that point the error is identically zero. The transition is found to occur at α=1.245 examples per coupling [E. Gardner and B. Derrida, J. Phys. A 22, 1983 (1989)].
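The setting described in the abstract can be illustrated numerically. The sketch below (our illustration, not the paper's replica calculation) enumerates, for small N, every binary-coupling student consistent with a set of teacher-labeled random examples, and evaluates the generalization error via the standard perceptron overlap formula ε(R) = arccos(R)/π, where R = W·W*/N. All names and parameter values here are our own choices for the demonstration.

```python
import itertools
import numpy as np

# Teacher perceptron with binary couplings W* in {-1,+1}^N.
# N is odd so +/-1 dot products are never zero and sign() is unambiguous.
rng = np.random.default_rng(0)
N = 11
teacher = rng.choice([-1, 1], size=N)

def generalization_error(student, teacher):
    """eps(R) = arccos(R)/pi for overlap R = W.W*/N (perceptron geometry)."""
    R = np.dot(student, teacher) / len(teacher)
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

def consistent_students(alpha):
    """All binary couplings that label alpha*N random examples like the teacher."""
    P = int(alpha * N)
    X = rng.choice([-1, 1], size=(P, N))   # random +/-1 example patterns
    y = np.sign(X @ teacher)               # teacher labels
    keep = []
    for w in itertools.product([-1, 1], repeat=N):
        w = np.array(w)
        if np.all(np.sign(X @ w) == y):
            keep.append(w)
    return keep

# As alpha (examples per coupling) grows, the set of consistent binary
# students shrinks toward the teacher alone, so the mean generalization
# error of a consistent student drops toward zero.
for alpha in (0.5, 2.0, 4.0):
    S = consistent_students(alpha)
    mean_eps = np.mean([generalization_error(w, teacher) for w in S])
    print(f"alpha={alpha:3.1f}  consistent students={len(S):4d}  mean eps={mean_eps:.3f}")
```

At these tiny sizes the sharp transition at α=1.245 is of course smeared out; the enumeration only shows the qualitative mechanism, the collapse of the version space onto the teacher, that the thermodynamic calculation makes exact.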
This publication has 13 references indexed in Scilit:
- Inference of a rule by a neural network with thermal noise, Physical Review Letters, 1990
- Learning from Examples in a Single-Layer Neural Network, Europhysics Letters, 1990
- Mapping correlated Gaussian patterns in a perceptron, Journal of Physics A: General Physics, 1989
- Three unfinished works on the optimal storage capacity of networks, Journal of Physics A: General Physics, 1989
- Optimal storage properties of neural network models, Journal of Physics A: General Physics, 1988
- The space of interactions in neural network models, Journal of Physics A: General Physics, 1988
- Layered feed-forward neural network with exactly soluble dynamics, Physical Review A, 1988
- Maximum Storage Capacity in Neural Networks, Europhysics Letters, 1987
- Exact solution of a layered neural network model, Physical Review Letters, 1987
- Storing and Retrieving Information in a Layered Spin System, Europhysics Letters, 1986