A local linearized least squares algorithm for training feedforward neural networks
- 1 March 2000
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 11 (2), 487-495
- https://doi.org/10.1109/72.839017
Abstract
In training the weights of a feedforward neural network, it is well known that the global extended Kalman filter (GEKF) algorithm has much better performance than the popular gradient descent with error backpropagation in terms of convergence and quality of solution. However, the GEKF is very computationally intensive, which has led to the development of efficient algorithms such as the multiple extended Kalman algorithm (MEKA) and the decoupled extended Kalman filter algorithm (DEKF), which are based on dimensional reduction and/or partitioning of the global problem. In this paper we present a new training algorithm, called local linearized least squares (LLLS), that is based on viewing the local system identification subproblems at the neuron level as recursive linearized least squares problems. The objective function of the least squares problem for each neuron is the sum of the squares of the linearized backpropagated error signals. The new algorithm is shown to give better convergence results for three benchmark problems in comparison to MEKA, and in comparison to DEKF for highly coupled applications. The performance of the LLLS algorithm approaches that of the GEKF algorithm in the experiments.
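The abstract describes solving a recursive linearized least squares problem at each neuron, driven by linearized backpropagated error signals. As an illustrative sketch only, and not the authors' exact update equations, a standard recursive least squares (RLS) step for one neuron's weight vector might look like the following (the function name, the forgetting factor `lam`, and the initialization are assumptions for the example):

```python
import numpy as np

def rls_update(w, P, x, e_lin, lam=1.0):
    """One recursive least-squares step for a single neuron's weights.

    w     : current weight vector of the neuron
    P     : inverse input-correlation matrix estimate
    x     : input (regressor) vector seen by the neuron
    e_lin : linearized backpropagated error signal for this neuron
    lam   : forgetting factor (1.0 = no forgetting)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # RLS gain vector
    w = w + k * e_lin                # weight correction driven by the linearized error
    P = (P - np.outer(k, Px)) / lam  # rank-one update of the inverse correlation matrix
    return w, P

# Hypothetical usage: recover the weights of a single linear neuron.
rng = np.random.default_rng(0)
w_true = np.array([1.5, -2.0, 0.5])
w, P = np.zeros(3), 100.0 * np.eye(3)   # assumed initialization
for _ in range(200):
    x = rng.standard_normal(3)
    e_lin = w_true @ x - w @ x          # error between target and current output
    w, P = rls_update(w, P, x, e_lin)
```

Because each neuron maintains only its own small `P` matrix, per-step cost scales with the square of a neuron's fan-in rather than with the square of the total weight count, which is the kind of saving the local algorithms discussed here aim for relative to GEKF.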
This publication has 11 references indexed in Scilit:
- Training feed-forward networks with the extended Kalman algorithm. Institute of Electrical and Electronics Engineers (IEEE), 2003.
- Node decoupled extended Kalman filter based learning algorithm for neural networks. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Neural network control of a four-wheel ABS model. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Decoupled extended Kalman filter training of feedforward layered networks. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Linearized least-squares training of multilayer feedforward neural networks. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Extensions and enhancements of decoupled extended Kalman filter training. Institute of Electrical and Electronics Engineers (IEEE), 2002.
- Fine Pitch Stencil Printing Process Modeling and Optimization. Journal of Electronic Packaging, 1996.
- Neurocontrol of nonlinear dynamical systems with Kalman filter trained recurrent networks. IEEE Transactions on Neural Networks, 1994.
- Optimal filtering algorithms for fast learning in feedforward neural networks. Neural Networks, 1992.
- MEKA — a fast, local algorithm for training feedforward neural networks. Institute of Electrical and Electronics Engineers (IEEE), 1990.