Generalization in a two-layer neural network
- 1 December 1993
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 48 (6), 4805-4809
- https://doi.org/10.1103/physreve.48.4805
Abstract
Generalization in a fully connected two-layer neural network with N input nodes, M hidden nodes, a single output node, and binary weights is studied in the annealed approximation. When the number of examples is of the order of N, the generalization error approaches a plateau and the system is in a permutation-symmetric phase. When the number of examples is of the order of MN, the system undergoes a first-order phase transition to perfect generalization and the permutation symmetry is broken. Results of computer simulations show good agreement with the analytic calculation.
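
To make the setup concrete, below is a minimal sketch of a teacher-student measurement of generalization error for this architecture. It assumes a committee-machine output rule (the output node takes the sign of the majority vote of the hidden units) and small illustrative sizes N and M; the paper's analysis works in the annealed approximation in the large-N limit, so this only illustrates the model class and the error measurement, not the analytic calculation itself.

```python
import numpy as np

# Illustrative sizes only; the paper's results hold in the limit N -> infinity.
N, M = 100, 3  # input nodes, hidden nodes

rng = np.random.default_rng(0)

def committee_output(W, x):
    """Output of a fully connected two-layer network with binary (+/-1) weights.

    W has shape (M, N); each hidden unit takes the sign of its weighted input,
    and the output node takes the sign of the hidden-unit sum (majority vote).
    """
    hidden = np.sign(W @ x)
    return np.sign(hidden.sum())

# Teacher and student networks with independently drawn binary weights.
teacher = rng.choice([-1, 1], size=(M, N))
student = rng.choice([-1, 1], size=(M, N))

# Estimate the generalization error as the probability that the student
# disagrees with the teacher on a fresh random example.
num_test = 10_000
disagree = 0
for _ in range(num_test):
    x = rng.standard_normal(N)
    disagree += committee_output(teacher, x) != committee_output(student, x)

print(f"estimated generalization error: {disagree / num_test:.3f}")
```

Because the student here is untrained (its weights are drawn independently of the teacher's), the estimated error sits near 1/2; training the student on P examples is what drives the error down to the plateau at P of order N and, for binary weights, through the first-order transition to perfect generalization at P of order MN described in the abstract.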