An analog VLSI recurrent neural network learning a continuous-time trajectory
- 1 March 1996
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 7 (2), 346-361
- https://doi.org/10.1109/72.485671
Abstract
Real-time algorithms for gradient descent supervised learning in recurrent dynamical neural networks fail to support scalable VLSI implementation because their complexity grows sharply with network dimension. We present an alternative implementation in analog VLSI, which employs a stochastic perturbation algorithm to observe the gradient of the error index directly on the network in random directions of the parameter space, thereby avoiding the tedious task of deriving the gradient from an explicit model of the network dynamics. The network contains six fully recurrent neurons with continuous-time dynamics, providing 42 free parameters comprising connection strengths and thresholds. The chip implementing the network includes local provisions supporting both the learning and storage of the parameters, integrated in a scalable architecture which can be readily expanded for applications of learning recurrent dynamical networks requiring larger dimensionality. We describe and characterize the functional elements comprising the implemented recurrent network and integrated learning system, and include experimental results obtained from training the network to represent a quadrature-phase oscillator.
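The stochastic perturbation scheme the abstract describes can be sketched in software: perturb all parameters simultaneously in a random ±δ direction, measure the resulting change in the error index on the (here simulated) network, and descend along the observed directional gradient. This is a minimal sketch only; the quadratic toy error index, the target vector, and the learning-rate and perturbation constants below are illustrative assumptions, not values or circuits from the paper:

```python
import numpy as np

# Toy error index standing in for the trajectory error measured on the chip;
# the target vector is an arbitrary illustrative choice.
TARGET = np.array([1.0, -2.0, 0.5])

def error_index(theta):
    return float(np.sum((theta - TARGET) ** 2))

def weight_perturbation_step(theta, eta=0.05, delta=1e-3, rng=None):
    """One parallel weight-perturbation update.

    All parameters are perturbed at once by +delta or -delta; the observed
    change in the error index gives a one-shot estimate of the gradient
    along that random direction, which is then descended.
    """
    rng = np.random.default_rng() if rng is None else rng
    pert = delta * rng.choice([-1.0, 1.0], size=theta.shape)
    dE = error_index(theta + pert) - error_index(theta)
    # dE / pert_i estimates dE/dtheta_i (cross terms average to zero
    # over many random perturbation directions).
    return theta - eta * (dE / delta) * np.sign(pert)

rng = np.random.default_rng(0)
theta = np.zeros(3)
for _ in range(200):
    theta = weight_perturbation_step(theta, rng=rng)
```

No explicit model of the network dynamics is ever differentiated: only two evaluations of the error index per step are needed, which is what makes the method attractive for on-chip learning where the "network" is physical analog hardware.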
This publication has 37 references indexed in Scilit:
- A CMOS analog adaptive BAM with on-chip learning and weight refreshing, IEEE Transactions on Neural Networks, 1993
- Improved implementation of the silicon cochlea, IEEE Journal of Solid-State Circuits, 1992
- Weight perturbation: an optimal architecture and learning technique for analog VLSI feedforward and recurrent multilayer networks, IEEE Transactions on Neural Networks, 1992
- Current-mode subthreshold MOS circuits for analog VLSI neural systems, IEEE Transactions on Neural Networks, 1991
- An analog integrated neural network capable of learning the Feigenbaum logistic map, IEEE Transactions on Circuits and Systems, 1990
- A high-swing, high-impedance MOS cascode circuit, IEEE Journal of Solid-State Circuits, 1990
- Experiments in nonconvex optimization: Stochastic approximation with function smoothing and simulated annealing, Neural Networks, 1990
- A Learning Algorithm for Continually Running Fully Recurrent Neural Networks, Neural Computation, 1989
- Learning State Space Trajectories in Recurrent Neural Networks, Neural Computation, 1989
- A Stochastic Approximation Method, The Annals of Mathematical Statistics, 1951