Dynamics of Learning in Simple Perceptrons

Abstract
We examine the statistical dynamics of learning in a single-layer network in the presence of noise using a Langevin model. The learning scheme is linear (the delta rule), and we add a chemical potential term to constrain the size of the couplings. For random, uncorrelated input patterns we calculate the average relaxation time by linear response theory and determine the critical storage capacity. We find that noise reduces the learning time; without the constraint, noise has no effect on the relaxation time. Finally, we study the case of unsupervised Hebbian learning and find that the relaxation time increases with noise.
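To make the setup concrete, the following is a minimal simulation sketch of the dynamics described above, assuming a quadratic delta-rule energy E = (1/2) sum_mu (zeta^mu - w . xi^mu)^2 + (lambda/2) |w|^2 with a chemical-potential (weight-decay) term lambda, integrated by Euler-Maruyama steps. The symbols, parameter values, and variable names are illustrative assumptions, not taken from the paper.

import numpy as np

# Sketch (not the paper's code): Langevin delta-rule learning in a
# single-layer perceptron with a chemical-potential term constraining |w|^2.

rng = np.random.default_rng(0)

N, P = 100, 50            # input dimension, number of patterns (alpha = P/N)
T = 0.1                   # noise temperature
lam = 0.5                 # chemical-potential (constraint) strength
dt, steps = 0.01, 5000    # Euler-Maruyama step size and number of steps

xi = rng.standard_normal((P, N)) / np.sqrt(N)   # random uncorrelated patterns
zeta = rng.choice([-1.0, 1.0], size=P)          # random targets
w = np.zeros(N)                                 # couplings

for _ in range(steps):
    err = zeta - xi @ w                          # delta-rule errors per pattern
    grad = -xi.T @ err + lam * w                 # gradient of E plus constraint
    noise = np.sqrt(2 * T * dt) * rng.standard_normal(N)  # Langevin noise
    w += -dt * grad + noise                      # Euler-Maruyama update

print("final training MSE:", np.mean((zeta - xi @ w) ** 2))
print("|w|^2 =", w @ w)

Setting lam = 0 in this sketch corresponds to the unconstrained case mentioned above, and the decay of the training error over time gives a numerical handle on the relaxation time as a function of T.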