Absolute stability conditions for discrete-time recurrent neural networks
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (6), 954-964
- https://doi.org/10.1109/72.329693
Abstract
An analysis of the absolute stability for a general class of discrete-time recurrent neural networks (RNN's) is presented. A discrete-time model of RNN's is represented by a set of nonlinear difference equations. Some sufficient conditions for the absolute stability are derived using Ostrowski's theorem and the similarity transformation approach. For a given RNN model, these conditions are determined by the synaptic weight matrix of the network. The results reported in this paper need fewer constraints on the weight matrix and the model than in previously published studies.
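The abstract's setting can be illustrated with a small sketch. The model below, `x[k+1] = tanh(W x[k] + u)`, is a standard form of the nonlinear difference equations the abstract mentions, and the test applied is a classical sufficient condition (spectral norm of the weight matrix below 1, which makes the 1-Lipschitz `tanh` update a contraction). This is an assumed stand-in for illustration only: the paper's actual conditions, derived via Ostrowski's theorem and similarity transformations, are less restrictive and are not reproduced here.

```python
import numpy as np

def is_contractive(W):
    """Classical sufficient (not necessary) test: since tanh is 1-Lipschitz,
    ||W||_2 < 1 makes the RNN update a contraction mapping, so every
    trajectory converges to a unique equilibrium (absolute stability)."""
    return np.linalg.norm(W, 2) < 1.0

def simulate(W, u, x0, steps=200):
    """Iterate the nonlinear difference equation x[k+1] = tanh(W x[k] + u)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = np.tanh(W @ x + u)
    return x

# Example weight matrix with spectral norm well below 1 (hypothetical values).
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])
u = np.array([0.5, -0.1])

# Two very different initial states converge to the same equilibrium.
xa = simulate(W, u, [ 2.0,  2.0])
xb = simulate(W, u, [-2.0, -2.0])
print(is_contractive(W), np.allclose(xa, xb, atol=1e-8))
```

Conditions of the kind stated in the abstract are attractive in practice because, like this sketch, they can be checked directly from the synaptic weight matrix without simulating trajectories.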
This publication has 21 references indexed in Scilit:
- An analog feedback associative memory, IEEE Transactions on Neural Networks, 1993
- Discrete-time versus continuous-time models of neural networks, Journal of Computer and System Sciences, 1992
- Stability of fixed points and periodic orbits and bifurcations in analog neural networks, Neural Networks, 1992
- Dynamical analysis of the brain-state-in-a-box (BSB) neural models, IEEE Transactions on Neural Networks, 1992
- Analysis and synthesis of a class of discrete-time neural networks described on hypercubes, IEEE Transactions on Neural Networks, 1991
- Equilibrium characterization of dynamical neural networks and a systematic synthesis procedure for associative memories, IEEE Transactions on Neural Networks, 1991
- Associative memory in an analog iterated-map neural network, Physical Review A, 1990
- Dynamics of iterated-map neural networks, Physical Review A, 1989
- Stability of analog neural networks with delay, Physical Review A, 1989
- On the stability, storage capacity, and design of nonlinear continuous neural networks, IEEE Transactions on Systems, Man, and Cybernetics, 1988