Abstract
Learning from examples by a perceptron with binary synaptic parameters is studied. The examples are generated by a reference (teacher) perceptron. It is shown that as the number of examples increases, the network undergoes a first-order transition at which it freezes into the state of the reference perceptron. As the transition point is approached from below, the generalization error reaches a minimal positive value, while above that point the error is identically zero. The transition is found to occur at α_GD = 1.245 examples per coupling [E. Gardner and B. Derrida, J. Phys. A 22, 1983 (1989)].
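The teacher-student setup described above can be illustrated with a short numerical sketch. The following is an assumed minimal simulation, not the paper's calculation: it builds a binary-coupling teacher and a binary student with a prescribed overlap, then estimates the generalization error (the probability that the two perceptrons disagree on a random input) and compares it with the standard perceptron result ε = arccos(R)/π, where R is the normalized teacher-student overlap. The system size N, the overlap fraction f, and the sample count P are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000  # number of couplings (illustrative size)

# Binary reference (teacher) couplings in {-1, +1}^N.
teacher = rng.choice([-1, 1], size=N)

# Binary student that agrees with the teacher on a fraction f of couplings.
f = 0.9
student = teacher.copy()
flip = rng.choice(N, size=int((1 - f) * N), replace=False)
student[flip] *= -1

# Monte Carlo estimate of the generalization error: the probability that
# teacher and student classify a random Gaussian input differently.
P = 20000
X = rng.standard_normal((P, N))
err = np.mean(np.sign(X @ teacher) != np.sign(X @ student))

# For perceptrons the generalization error is arccos(R)/pi,
# with R the normalized overlap between teacher and student.
R = teacher @ student / N
print(err, np.arccos(R) / np.pi)
```

When the student freezes into the teacher's state (f = 1, so R = 1), the estimated error drops to zero, matching the regime above the transition described in the abstract.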
