Dynamical Stability Conditions for Recurrent Neural Networks with Unsaturating Piecewise Linear Transfer Functions
- 1 August 2001
- journal article
- Published by MIT Press in Neural Computation
- Vol. 13 (8), 1811-1825
- https://doi.org/10.1162/08997660152469350
Abstract
We establish two conditions that ensure the nondivergence of additive recurrent networks with unsaturating piecewise linear transfer functions, also called linear threshold or semilinear transfer functions. As Hahnloser, Sarpeshkar, Mahowald, Douglas, and Seung (2000) showed, networks of this type can be built efficiently in silicon and exhibit the coexistence of digital selection and analog amplification in a single circuit. To obtain this behavior, the network must be multistable and nondivergent, and our conditions make it possible to determine the regimes where this can be achieved with maximal recurrent amplification. The first condition applies to nonsymmetric networks and has a simple interpretation: the strength of local inhibition must match the sum of the excitatory weights converging onto a neuron. The second condition is restricted to symmetric networks but can also take into account the stabilizing effect of nonlocal inhibitory interactions. We demonstrate the application of the conditions on a simple example and on the orientation-selectivity model of Ben-Yishai, Lev Bar-Or, and Sompolinsky (1995). We show that the conditions can be used to identify in their model regions of maximal orientation-selective amplification and symmetry breaking.
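The first condition can be illustrated numerically. The sketch below (not the paper's own code; weights, inputs, and time step are invented for illustration) simulates a linear threshold network dx/dt = -x + W[x]⁺ + h and sets each neuron's local inhibition equal to the summed excitatory weights it receives, which keeps the trajectory bounded:

```python
import numpy as np

def simulate_lt_network(W, h, x0, dt=0.01, steps=5000):
    """Euler-integrate the linear threshold dynamics dx/dt = -x + W [x]^+ + h."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (-x + W @ np.maximum(x, 0.0) + h)
    return x

np.random.seed(0)
n = 3
# Excitatory off-diagonal weights (illustrative values).
W_exc = 0.4 * (np.ones((n, n)) - np.eye(n))
# First condition (sketch): local inhibition on each neuron matches the
# summed excitatory weight converging onto it.
inhibition = W_exc.sum(axis=1)
W = W_exc - np.diag(inhibition)
h = np.array([1.0, 0.5, 0.2])  # constant external input (illustrative)

x_final = simulate_lt_network(W, h, np.random.rand(n))
print(np.all(np.isfinite(x_final)))  # → True: the trajectory does not diverge
```

With this balance, the effective recurrent matrix in any active regime has no eigenvalue exceeding the leak, so the state remains bounded; increasing the excitatory weights without matching the inhibition breaks the condition and can make the same network diverge.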
References indexed in Scilit (selection):
- A model for the intracortical origin of orientation preference and tuning in macaque striate cortex. Visual Neuroscience, 1999
- A model for the depth-dependence of receptive field size and contrast sensitivity of cells in layer 4C of macaque striate cortex. Vision Research, 1998
- On the piecewise analysis of networks of linear threshold neurons. Neural Networks, 1998
- Lyapunov Functions for Neural Nets with Nondifferentiable Input-Output Characteristics. Neural Computation, 1997
- Qualitative behaviour of some simple networks. Journal of Physics A: General Physics, 1996
- On Neurodynamics with Limiter Function and Linsker's Developmental Model. Neural Computation, 1996
- Recurrent Excitation in Neocortical Circuits. Science, 1995
- Theory of orientation tuning in visual cortex. Proceedings of the National Academy of Sciences, 1995
- Equilibria of the brain-state-in-a-box (BSB) neural model. Neural Networks, 1988
- Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Transactions on Systems, Man, and Cybernetics, 1983