Training feedforward networks with the Marquardt algorithm
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (6), 989-993
- https://doi.org/10.1109/72.329697
Abstract
The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks. The algorithm is tested on several function approximation problems, and is compared with a conjugate gradient algorithm and a variable learning rate algorithm. It is found that the Marquardt algorithm is much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
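The update at the heart of the abstract is the Levenberg-Marquardt step, Δw = -(JᵀJ + μI)⁻¹Jᵀe, where J is the Jacobian of the network errors with respect to the weights and μ is raised or lowered depending on whether a trial step reduces the sum-squared error. The sketch below illustrates that step on a toy sin(x) function approximation problem; the network size, training data, and the factor-of-10 μ adjustments are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of Levenberg-Marquardt training for a 1-input,
# n_hidden-unit tanh network approximating sin(x). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 40)   # assumed toy data set
t = np.sin(x)

n_hidden = 5
# Pack all weights into one vector w = [W1, b1, W2, b2].
w = rng.normal(scale=0.5, size=3 * n_hidden + 1)

def unpack(w):
    return (w[:n_hidden], w[n_hidden:2*n_hidden],
            w[2*n_hidden:3*n_hidden], w[-1])

def forward(w, x):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(np.outer(x, W1) + b1)        # hidden activations, (N, n_hidden)
    return h @ W2 + b2, h

def jacobian(w, x):
    """Jacobian of the per-sample errors e_i = t_i - y_i w.r.t. the weights."""
    W1, b1, W2, b2 = unpack(w)
    y, h = forward(w, x)
    dh = 1.0 - h**2                          # tanh derivative
    J = np.empty((x.size, w.size))
    J[:, :n_hidden] = -(dh * W2) * x[:, None]    # de/dW1
    J[:, n_hidden:2*n_hidden] = -(dh * W2)       # de/db1
    J[:, 2*n_hidden:3*n_hidden] = -h             # de/dW2
    J[:, -1] = -1.0                              # de/db2
    return J

mu = 1e-2
for epoch in range(200):
    y, _ = forward(w, x)
    e = t - y
    sse = e @ e
    J = jacobian(w, x)
    while True:
        # Marquardt step: solve (J^T J + mu I) dw = -J^T e.
        dw = np.linalg.solve(J.T @ J + mu * np.eye(w.size), -J.T @ e)
        e_new = t - forward(w + dw, x)[0]
        if e_new @ e_new < sse:
            # Step reduced the error: accept it and shrink mu,
            # trusting the Gauss-Newton direction more.
            w += dw
            mu = max(mu * 0.1, 1e-12)
            break
        # Step failed: grow mu, backing off toward small
        # gradient-descent-like steps.
        mu *= 10.0
        if mu > 1e10:
            break

print("final SSE:", np.sum((forward(w, x)[0] - t) ** 2))
```

When a trial step succeeds, shrinking μ moves the update toward the fast Gauss-Newton direction; when it fails, growing μ makes the step shorter and closer to steepest descent, which is the adaptive behavior the paper exploits for small networks.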