Abstract
Global stability is examined for nonlinear feedback dynamical systems subject to unsupervised learning. Attention is restricted to differentiable neural models. The unconditional stability of Hebbian learning systems is summarized in the adaptive bidirectional associative memory (ABAM) theorem. When no learning occurs, the resulting BAM models include Cohen-Grossberg autoassociators, Hopfield circuits, brain-state-in-a-box models, and masking field models. The ABAM theorem is extended to arbitrary higher-order Hebbian learning, and conditions for exponential convergence are discussed. Sufficient conditions for global stability are also established for dynamical systems that adapt according to competitive and differential Hebbian learning laws.
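For concreteness, a minimal sketch of the kind of system the ABAM theorem addresses, in the form commonly associated with Kosko's ABAM model; the notation ($a_i$, $b_i$, $S_i$, $m_{ij}$) and hypotheses (positive amplification functions $a_i$, monotone nondecreasing signal functions $S_i$) are illustrative assumptions, not definitions quoted from this abstract:

    \dot{x}_i = -a_i(x_i)\Big[\, b_i(x_i) - \sum_{j} S_j(y_j)\, m_{ij} \Big],
    \dot{y}_j = -a_j(y_j)\Big[\, b_j(y_j) - \sum_{i} S_i(x_i)\, m_{ij} \Big],
    \dot{m}_{ij} = -m_{ij} + S_i(x_i)\, S_j(y_j),

where the third equation is the signal Hebbian learning law. Setting $\dot{m}_{ij} \equiv 0$ freezes the weights and recovers the nonadaptive BAM case cited above. Under the same illustrative notation, the competitive and differential Hebbian laws mentioned in the closing sentence are usually written as $\dot{m}_{ij} = S_j(y_j)\,[\,S_i(x_i) - m_{ij}\,]$ and $\dot{m}_{ij} = -m_{ij} + \dot{S}_i(x_i)\, \dot{S}_j(y_j)$, respectively.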