Inference of a rule by a neural network with thermal noise
- 11 June 1990
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review Letters
- Vol. 64 (24), 2957-2960
- https://doi.org/10.1103/physrevlett.64.2957
Abstract
Learning and generalization by a perceptron are described within a statistical-mechanical framework. In the specific case considered here, the goal of learning is to infer the properties of a reference perceptron from examples. As the number of examples is increased a transition to optimal learning at finite temperature is found: The generalization error can be decreased by adding thermal noise to the synaptic coupling parameters. Although the transition is weak, significant improvement can be achieved further beyond the threshold.
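The abstract's setting, a "student" perceptron inferring a "teacher" (reference) perceptron from labeled examples at finite temperature, can be illustrated numerically. The sketch below is a minimal Metropolis-style simulation of Gibbs learning, not the paper's analytic replica calculation: the training energy counts misclassified examples, thermal noise at temperature `T` lets the student accept occasional uphill moves, and the generalization error is the standard teacher-student formula eps = arccos(R)/pi, where R is the weight overlap. All names, sizes, and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, T = 20, 200, 0.5          # input dimension, number of examples, temperature

# Teacher (reference) perceptron on the unit sphere
teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)

# Training examples labeled by the teacher
X = rng.standard_normal((P, N))
y = np.sign(X @ teacher)

def training_energy(w):
    """Number of training examples the student misclassifies."""
    return int(np.count_nonzero(np.sign(X @ w) != y))

# Random initial student, then Metropolis dynamics at temperature T
w = rng.standard_normal(N)
w /= np.linalg.norm(w)
E = training_energy(w)
for _ in range(5000):
    w_new = w + 0.1 * rng.standard_normal(N)   # thermal perturbation of couplings
    w_new /= np.linalg.norm(w_new)
    E_new = training_energy(w_new)
    # Accept downhill moves always, uphill moves with Boltzmann probability
    if E_new <= E or rng.random() < np.exp((E - E_new) / T):
        w, E = w_new, E_new

R = float(w @ teacher)                          # teacher-student overlap
eps_g = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi
print(f"training errors: {E}/{P}, generalization error: {eps_g:.3f}")
```

With P/N = 10 examples per coupling, the student aligns with the teacher and the generalization error falls well below the chance value of 0.5; scanning `T` in such a simulation is one crude way to probe the finite-temperature effect the abstract describes.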
References
- Learning from Examples in a Single-Layer Neural Network. Europhysics Letters, 1990
- Phase transitions in simple learning. Journal of Physics A: General Physics, 1989
- Three unfinished works on the optimal storage capacity of networks. Journal of Physics A: General Physics, 1989
- Linear and Nonlinear Extension of the Pseudo-Inverse Solution for Learning Boolean Functions. Europhysics Letters, 1989
- Dynamics of Learning in Simple Perceptrons. Physica Scripta, 1989
- Perceptron beyond the limit of capacity. Journal de Physique, 1989
- Optimal storage properties of neural network models. Journal of Physics A: General Physics, 1988
- The space of interactions in neural network models. Journal of Physics A: General Physics, 1988
- Storing and Retrieving Information in a Layered Spin System. Europhysics Letters, 1986
- Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition. IEEE Transactions on Electronic Computers, 1965