Forgetful Memories

Abstract
Iterative learning schemes for fully connected neural nets are solved analytically. The key to the solution is a Markov-chain representation of the nonlinear, iterative encoding procedure, whose asymptotics can be determined exactly. Forgetting is an intrinsic property of the network, induced by a first-order transition of the retrieval quality as a function of the storage ancestry. The storage capacity, which can be obtained analytically, is extensive. Numerical simulations are found to be in excellent agreement with our results.

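The abstract does not spell out the encoding rule itself, but the forgetting behaviour it describes can be illustrated with a minimal sketch: a fully connected network of +/-1 units in which each new pattern is Hebb-encoded on top of exponentially decayed couplings, a simple palimpsest-type stand-in rather than the paper's actual scheme. All names and parameters below (N, P, LAM, EPS, the decay rule) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400          # number of neurons (fully connected); illustrative size
P = 80           # number of patterns stored in sequence
LAM = 0.95       # decay of old couplings per learning step (assumed value)
EPS = 1.0        # Hebbian amplitude per pattern (assumed value)

def encode(J, xi, lam=LAM, eps=EPS):
    """One encoding iteration: decay the existing couplings, add a Hebb term."""
    J = lam * J + (eps / N) * np.outer(xi, xi)
    np.fill_diagonal(J, 0.0)   # no self-couplings
    return J

def retrieval_overlap(J, xi, steps=25):
    """Zero-temperature parallel dynamics started at the stored pattern;
    returns the final overlap m = (1/N) sum_i s_i xi_i with that pattern."""
    s = xi.astype(float).copy()
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0.0] = 1.0      # break ties deterministically
    return float(s @ xi) / N

# Store P random +/-1 patterns one after another with the iterative rule.
patterns = rng.choice([-1, 1], size=(P, N))
J = np.zeros((N, N))
for xi in patterns:
    J = encode(J, xi)

# Retrieval quality as a function of storage age: recently stored patterns
# give m close to 1, while beyond a certain age m drops sharply -- a
# qualitative analogue of the forgetting behaviour the abstract describes.
for age in (0, 5, 10, 15, 20, 30, 50):
    m = retrieval_overlap(J, patterns[P - 1 - age])
    print(f"age {age:2d}: m = {m:+.2f}")
```

With these assumed values the printed overlaps should stay near 1 for recent patterns and collapse once the decayed signal falls below the accumulated interference noise, mimicking (only qualitatively) the sharp drop in retrieval quality with storage age that the paper solves analytically via its Markov-chain representation.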