Abstract
The performance of the optimal-stability perceptron learning algorithm of Krauth and Mézard is studied for the learning of random unbiased patterns in neural networks. In the thermodynamic limit (N → ∞, P → ∞ with α = P/N finite), a replica approach is used to find the exact distribution of the number of time steps required to stabilize a pattern. Remarkably, for each neuron a finite fraction of the patterns do not contribute explicitly to the couplings but are stabilized by the other patterns.
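The algorithm referred to above is the Krauth–Mézard "MinOver" rule: at each time step the pattern with the smallest aligned local field is selected and used to update the couplings, until every pattern's stability exceeds a margin. A minimal sketch, assuming ±1 patterns and a per-pattern update count (the function name, margin parameter `c`, and step cap are illustrative choices, not from the paper):

```python
import numpy as np

def minover(patterns, targets, c=0.1, max_steps=20000):
    """Sketch of the Krauth-Mezard MinOver rule: repeatedly reinforce
    the pattern with the smallest aligned field until all normalized
    stabilities reach the margin c (or the step cap is hit)."""
    P, N = patterns.shape
    w = np.zeros(N)
    counts = np.zeros(P, dtype=int)  # how often each pattern was used
    for _ in range(max_steps):
        fields = targets * (patterns @ w)   # aligned local fields
        norm = np.linalg.norm(w)
        if norm > 0 and fields.min() / norm >= c:
            break                           # all patterns stabilized
        mu = int(np.argmin(fields))         # least stable pattern
        w += targets[mu] * patterns[mu] / N # Hebbian step on that pattern
        counts[mu] += 1
    return w, counts
```

In a run on random unbiased patterns, `counts` records the number of time steps spent on each pattern; the abstract's observation is that a finite fraction of these counts remain zero, those patterns being stabilized implicitly by the updates for the others.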
