Abstract
A novel approach is presented for training multilayer feedforward neural networks, using a conjugate gradient algorithm combined with an appropriate line search. The algorithm updates the input weights of each neuron in an efficient parallel manner, similar to that of the well-known backpropagation algorithm. Its superior performance relative to conventional backpropagation rests on strong theoretical grounds and is supported by numerical results on three examples.
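The abstract does not specify the exact conjugate gradient variant or line-search rule, so the following is only a minimal sketch of the general technique it names: a Polak-Ribiere conjugate gradient with an Armijo backtracking line search, applied to a small feedforward network on the XOR problem. The network size, the CG variant, the line-search constants, and all names below are illustrative assumptions, not the authors' method.

    import numpy as np

    # Sketch (assumed setup, not the paper's exact algorithm): train a
    # 2-4-1 sigmoid network on XOR by conjugate gradient with line search.
    rng = np.random.default_rng(0)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])

    shapes = [(2, 4), (1, 4), (4, 1), (1, 1)]  # W1, b1, W2, b2
    sizes = [int(np.prod(s)) for s in shapes]

    def unpack(w):
        parts, i = [], 0
        for s, n in zip(shapes, sizes):
            parts.append(w[i:i + n].reshape(s))
            i += n
        return parts

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def loss_grad(w):
        # Forward pass, then standard backpropagation of the squared error.
        W1, b1, W2, b2 = unpack(w)
        h = sigmoid(X @ W1 + b1)
        yhat = sigmoid(h @ W2 + b2)
        loss = 0.5 * np.mean((yhat - T) ** 2)
        n = X.shape[0]
        d2 = (yhat - T) * yhat * (1 - yhat) / n
        d1 = (d2 @ W2.T) * h * (1 - h)
        grads = [X.T @ d1, d1.sum(0, keepdims=True),
                 h.T @ d2, d2.sum(0, keepdims=True)]
        return loss, np.concatenate([g.ravel() for g in grads])

    def line_search(w, d, f0, g0, alpha=1.0, c=1e-4, shrink=0.5):
        # Backtracking line search enforcing the Armijo sufficient-decrease
        # condition along the descent direction d.
        slope = g0 @ d
        while True:
            f_new, _ = loss_grad(w + alpha * d)
            if f_new <= f0 + c * alpha * slope or alpha < 1e-12:
                return alpha
            alpha *= shrink

    w = rng.normal(scale=0.5, size=sum(sizes))
    f, g = loss_grad(w)
    d = -g
    for it in range(200):
        alpha = line_search(w, d, f, g)
        w = w + alpha * d
        f_new, g_new = loss_grad(w)
        # Polak-Ribiere coefficient, restarted (set to zero) when negative,
        # which falls back to a plain steepest-descent step.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        f, g = f_new, g_new

    print("final loss:", f)

Because every weight gradient comes out of one backpropagation pass, the per-neuron updates inside loss_grad are computed in parallel as matrix operations, which mirrors the parallel update structure the abstract attributes to the method.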
