A HIERARCHICAL METHOD FOR FINDING OPTIMAL ARCHITECTURE AND WEIGHTS USING EVOLUTIONARY LEAST SQUARE BASED LEARNING
- 1 February 2003
- Research article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Neural Systems
- Vol. 13, No. 1, pp. 13-24
- https://doi.org/10.1142/s0129065703001364
Abstract
In this paper, we present a novel approach that combines methodologies to find an appropriate neural network architecture and its weights using an evolutionary least-square-based algorithm (GALS). The paper focuses on the heuristics of updating weights using an evolutionary least-square-based algorithm, finding the number of hidden neurons for a two-layer feedforward neural network, the stopping criterion for the algorithm, and comparisons of the results with other existing methods for searching the multidimensional, complex search space comprising the architecture and weight variables for an optimal or near-optimal solution. We explain how the evolutionary least-square-based weight-updating algorithm can be combined with a growing-architecture model to find the optimum number of hidden neurons. We also discuss finding a probabilistic solution space as a starting point for the least square method and address problems involving fitness breaking. We apply the proposed approach to the XOR problem, the 10-bit odd parity problem, and several real-world benchmark data sets, including the handwriting data set from CEDAR and the breast cancer and heart disease data sets from the UCI ML repository. Comparative results based on classification accuracy and time complexity are discussed.
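The hybrid scheme described in the abstract can be sketched as a minimal example: a simple evolutionary search explores the input-to-hidden weights, while the hidden-to-output weights are solved in closed form by least squares, so each candidate's fitness already reflects the best linear readout. All names, population settings, and network sizes below are illustrative assumptions; the paper's actual GALS procedure (fitness breaking, the probabilistic initial solution space, and the growing-architecture step) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset (one of the benchmark problems named in the abstract)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = 4  # assumed number of hidden neurons for this sketch


def hidden(Xb, W):
    """Hidden-layer activations; W is (3, H): 2 inputs + 1 bias row."""
    Xa = np.hstack([Xb, np.ones((len(Xb), 1))])
    return np.tanh(Xa @ W)


def fitness(W):
    """Evaluate a candidate: solve output weights by least squares,
    return the resulting mean squared error and the solved weights."""
    Phi = hidden(X, W)
    beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    err = Phi @ beta - y
    return np.mean(err ** 2), beta


# A bare-bones (mu + lambda) evolutionary loop over hidden-layer weights.
pop = [rng.normal(size=(3, H)) for _ in range(20)]
for gen in range(50):
    pop.sort(key=lambda W: fitness(W)[0])
    elite = pop[:5]
    # Keep the elite; refill the population with Gaussian mutations.
    pop = elite + [w + rng.normal(scale=0.3, size=w.shape)
                   for w in elite for _ in range(3)]

best = min(pop, key=lambda W: fitness(W)[0])
mse, beta = fitness(best)
pred = (hidden(X, best) @ beta > 0.5).astype(int)
print(mse, pred.tolist())  # near-zero error, predictions [0, 1, 1, 0]
```

The closed-form least-squares solve is what distinguishes this hybrid from a plain genetic algorithm over all weights: the evolutionary search only has to cover the nonlinear (hidden-layer) part of the space, which is the combination the abstract credits for the method's efficiency.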