Learning from examples in large neural networks
- 24 September 1990
- journal article (research article)
- Published by American Physical Society (APS) in Physical Review Letters
- Vol. 65 (13), 1683-1686
- https://doi.org/10.1103/PhysRevLett.65.1683
Abstract
A statistical-mechanical theory of learning from examples in layered networks at finite temperature is studied. When the training error is a smooth function of continuously varying weights, the generalization error falls off asymptotically as the inverse number of examples. By analytical and numerical studies of single-layer perceptrons, we show that when the weights are discrete, the generalization error can exhibit a discontinuous transition to perfect generalization. For intermediate sizes of the example set, the state of perfect generalization coexists with a metastable spin-glass state.
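The discrete-weight transition described in the abstract can be probed numerically. Below is a minimal sketch, not the paper's replica calculation: it assumes a teacher perceptron with binary (±1) weights, Gaussian input patterns, and zero-temperature learning realized by exhaustively enumerating every binary student with zero training error (the version space), then averaging the generalization error. It uses the standard identity that two weight vectors with normalized overlap R disagree on the sign of a random Gaussian input with probability arccos(R)/π. The size N = 15, the random seed, and all function names are illustrative choices; at such small N the first-order transition appears only as a sharp crossover rather than a true discontinuity.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 15                                    # small enough to enumerate all 2^N binary students
teacher = rng.choice([-1.0, 1.0], size=N)
students = np.array(list(itertools.product([-1.0, 1.0], repeat=N)))

def version_space_error(alpha):
    """Average generalization error over all binary students with zero training error."""
    p = int(alpha * N)                    # number of training examples
    X = rng.standard_normal((p, N))       # Gaussian input patterns
    y = np.sign(X @ teacher)              # teacher labels (nonzero with probability 1)
    # A student is consistent if it reproduces every training label.
    consistent = np.all(np.sign(students @ X.T) == y, axis=1)
    R = (students[consistent] @ teacher) / N          # teacher-student overlaps
    eps_g = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi  # disagreement probability on a new example
    return eps_g.mean(), int(consistent.sum())

for alpha in (0.5, 1.0, 1.5, 2.0, 3.0):
    eps, n_vs = version_space_error(alpha)
    print(f"alpha = {alpha:3.1f}: version-space size = {n_vs:5d}, eps_g = {eps:.3f}")
```

As alpha grows, the version space collapses rapidly toward the teacher alone and eps_g drops toward zero, the small-N shadow of the transition to perfect generalization. With continuously varying weights instead, an analogous experiment (e.g., a perceptron trained to zero error) would show the smooth inverse-number-of-examples decay noted in the abstract.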