Abstract
Nonlinear neural networks are studied near saturation, when the number q of stored patterns is proportional to the system size N, i.e., q = αN. The statistical mechanics is obtained for arbitrary nonlinearity. For a wide class of models, including the original Hopfield model and clipped synapses, it is shown that there exists a critical storage ratio αc above which the system loses its memory completely. Furthermore, αc never exceeds αc^Hopfield, the critical capacity of the original Hopfield model, and is determined by a universal expression. A moderate dilution of the bonds may improve the memory function.
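
As a rough numerical illustration of the setup described above (not the paper's analytic derivation), the sketch below stores q = αN random ±1 patterns with the standard Hebbian rule and with clipped (sign-only) synapses, then checks retrieval via zero-temperature asynchronous dynamics. The values N = 500 and α = 0.10 are arbitrary choices for the demonstration; the Hopfield capacity αc ≈ 0.138 is quoted only for orientation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500                # number of spins (arbitrary demo size)
alpha = 0.10           # storage ratio q = alpha * N, below the Hopfield alpha_c ~ 0.138
q = int(alpha * N)

# q random +/-1 patterns to store
xi = rng.choice([-1, 1], size=(q, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu (original Hopfield model)
J_hebb = (xi.T @ xi) / N
np.fill_diagonal(J_hebb, 0.0)

# Clipped synapses: keep only the sign of each Hebbian coupling
J_clip = np.sign(J_hebb)
np.fill_diagonal(J_clip, 0.0)

def overlap_after_dynamics(J, pattern, sweeps=20):
    """Run zero-temperature asynchronous dynamics starting from the stored
    pattern and return the final overlap m = (1/N) sum_i s_i * xi_i."""
    s = pattern.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s                   # local field on spin i
            s[i] = 1 if h >= 0 else -1     # align spin with its local field
    return (s @ pattern) / N

m_hebb = overlap_after_dynamics(J_hebb, xi[0])
m_clip = overlap_after_dynamics(J_clip, xi[0])
print(f"alpha = {alpha}: overlap Hebbian = {m_hebb:.3f}, clipped = {m_clip:.3f}")
```

Run below αc, both coupling matrices should return overlaps near 1; raising α well above ≈ 0.138 makes the retrieval overlap collapse, mirroring the complete loss of memory described in the abstract.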
