On absolute stability of neural networks

Abstract
The aim of this paper is to discuss the role of absolute stability (ABST) in the design of neural optimization solvers and to find necessary and sufficient conditions for ABST for some classes of neural networks of applicative interest. By ABST it is meant that there is a unique equilibrium point attracting all trajectories of motion, and that this property holds for all neuron activation functions belonging to a specified class of nonlinear mappings and for all constant neural network inputs. ABST neural networks are best suited for solving optimization problems, being devoid of spurious suboptimal responses for every choice of the activation function and of the input vector. A necessary and sufficient condition for ABST has previously been found for symmetric neural networks of the Hopfield type. In this paper, we show that the concept of ABST can also be applied to special classes of nonsymmetric Hopfield neural networks and to neural models different from the Hopfield one. In particular, it is shown that necessary and sufficient conditions for ABST can be found for two interesting classes of nonsymmetric networks, namely, cooperative Hopfield-type networks and composite neural networks with variable and constraint neurons used for solving linear and quadratic programming problems in real time.

Authors: Forti, M. (Dept. of Electron. Eng., Florence Univ., Italy); Liberatore, A.; Manetti, S.; Marini, M.
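As a sketch of the setting (the notation below is a standard formulation of Hopfield-type dynamics, assumed here for illustration rather than taken from the paper), the network state obeys

```latex
\[
  \dot{x} = -Dx + T\,g(x) + I,
  \qquad D = \operatorname{diag}(d_1,\dots,d_n),\; d_i > 0,
\]
% x \in \mathbb{R}^n : neuron state vector
% T : interconnection (weight) matrix; symmetric in the classical case,
%     nonsymmetric for the cooperative and composite classes studied here
% g(x) = (g_1(x_1),\dots,g_n(x_n)) : neuron activation functions,
%     drawn from a specified class of nonlinear mappings
% I : constant input vector
```

In this notation, ABST means that for every admissible choice of the activations $g$ and every constant input $I \in \mathbb{R}^n$, the system has a unique equilibrium $x^\*$ and every trajectory converges to it, so an optimization solver built on such a network cannot get trapped in spurious suboptimal responses.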
