Nonlinear neural networks near saturation
- 1 August 1987
- journal article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 36 (4), 1959-1962
- https://doi.org/10.1103/physreva.36.1959
Abstract
Nonlinear neural networks are studied near saturation, when the number of stored patterns p is proportional to the system size N, i.e., p = αN. The statistical mechanics is obtained for arbitrary nonlinearity. For a wide class of models, including the original Hopfield model and clipped synapses, it is shown that there exists a critical α_c above which the system loses its memory completely. Furthermore, α_c never exceeds a universal upper bound and is given by a universal expression. A moderate dilution of the bonds may improve the memory function.
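A minimal sketch, not taken from the paper, illustrating the setting the abstract describes: a Hopfield network storing p = αN random patterns with Hebbian couplings, plus a "clipped" variant that keeps only the sign of each bond. Recall quality (overlap with a stored pattern after zero-temperature asynchronous updates) degrades as α grows. The network size, loading ratios, corruption level, and sweep count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def couplings(patterns, clipped=False):
    """Hebbian coupling matrix J_ij = (1/N) sum_mu xi_i^mu xi_j^mu; optionally clipped to sign(J_ij)."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)        # no self-coupling
    if clipped:
        J = np.sign(J)              # nonlinear (clipped) synapses
    return J

def recall_overlap(J, pattern, flip_fraction=0.1, sweeps=20):
    """Start near a stored pattern, run zero-temperature asynchronous dynamics, return the overlap m."""
    N = pattern.size
    s = pattern.copy()
    flip = rng.choice(N, size=int(flip_fraction * N), replace=False)
    s[flip] *= -1                   # corrupt a fraction of the spins
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s            # local field on spin i
            s[i] = 1 if h >= 0 else -1
    return float(s @ pattern) / N

N = 400
for alpha in (0.05, 0.10, 0.14, 0.20):          # illustrative loading ratios alpha = p/N
    p = max(1, int(alpha * N))
    xi = rng.choice([-1, 1], size=(p, N))       # p random binary patterns
    for clipped in (False, True):
        J = couplings(xi, clipped)
        m = np.mean([recall_overlap(J, xi[mu]) for mu in range(min(p, 10))])
        label = "clipped" if clipped else "Hebbian"
        print(f"alpha={alpha:.2f}  {label:7s}  mean overlap ~ {m:.2f}")
```

Running the sketch shows the qualitative picture only: for small α the overlap stays near 1, while at larger loading the retrieved states drift away from the stored patterns; the clipped couplings lose memory at a smaller loading than the linear Hebbian ones.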
References
- Statistical mechanics of neural networks near saturation. Annals of Physics, 1987
- Neural networks with nonlinear synapses and a static noise. Physical Review A, 1986
- Nonlinear Neural Networks. Physical Review Letters, 1986
- Spin glass model of learning by selection. Proceedings of the National Academy of Sciences, 1986
- Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks. Physical Review Letters, 1985
- Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences, 1984
- Collective properties of neural networks: A statistical physics approach. Biological Cybernetics, 1984
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 1982
- The replica method and solvable spin glass model. Journal of Physics A: General Physics, 1979
- The existence of persistent states in the brain. Mathematical Biosciences, 1974