Dynamics of iterated-map neural networks

Abstract
We analyze a discrete-time neural network with continuous state variables updated in parallel. We show that for symmetric connections, the only attractors are fixed points and period-two limit cycles. We also present a global stability criterion that guarantees only fixed-point attractors by placing limits on the gain (maximum slope) of the sigmoid nonlinearity. The iterated-map network has the same fixed points as a continuous-time analog electronic neural network and converges to an attractor after a small number of iterations of the map.
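The dynamics described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's own code: it assumes a tanh sigmoid, a random symmetric weight matrix W, and the parallel update rule x(t+1) = tanh(gain * W x(t)). At low gain the map contracts to a fixed point; a simple self-inhibitory scalar network at high gain settles onto a period-two limit cycle, the other attractor type the abstract identifies.

```python
import numpy as np

# Illustrative sketch (assumptions: tanh sigmoid, random symmetric W)
# of the parallel iterated map x(t+1) = tanh(gain * W @ x(t)).
# "gain" is the maximum slope of the sigmoid nonlinearity.

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                      # symmetric connections

def iterate(W, x, gain, steps=300):
    for _ in range(steps):
        x = np.tanh(gain * (W @ x))    # all units updated in parallel
    return x

# Low gain: the map is a contraction, so the network converges to a
# fixed-point attractor after a small number of iterations.
x_low = iterate(W, rng.standard_normal(n), gain=0.1)
assert np.allclose(x_low, np.tanh(0.1 * (W @ x_low)), atol=1e-8)

# High gain with a self-inhibitory connection: the scalar network
# x(t+1) = tanh(-2 x(t)) settles onto a period-two limit cycle.
Wneg = np.array([[-1.0]])
x_hi = iterate(Wneg, np.array([0.5]), gain=2.0)
x_next = np.tanh(2.0 * (Wneg @ x_hi))
assert not np.allclose(x_next, x_hi)          # not a fixed point
x_next2 = np.tanh(2.0 * (Wneg @ x_next))
assert np.allclose(x_next2, x_hi, atol=1e-8)  # returns after two steps
```

The choice of gain values here is only for demonstration; the paper's stability criterion relates the admissible gain to the connection matrix itself.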