Mutation-Based Genetic Neural Network
- 9 May 2005
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 16 (3), 587-600
- https://doi.org/10.1109/tnn.2005.844858
Abstract
Evolving gradient-learning artificial neural networks (ANNs) with an evolutionary algorithm (EA) is a popular way to address the local-optima and design problems of ANNs. The typical approach combines the strength of backpropagation (BP) in weight learning with the EA's ability to search the architecture space. However, BP's gradient descent is computationally intensive, which restricts the EA's search coverage by compelling it to use a small population size. To address this problem, we propose the mutation-based genetic neural network (MGNN), which replaces BP with the locally adaptive mutation strategy of evolutionary programming (EP) to effect weight learning. MGNN's mutation enables the network to evolve its structure and adapt its weights simultaneously. Moreover, MGNN's EP-based encoding scheme allows a flexible, less restricted formulation of the fitness function and makes fitness computation fast and efficient. This makes larger population sizes feasible and gives MGNN relatively wide search coverage of the architecture space. MGNN implements a stopping criterion in which overfitness occurrences are monitored through "sliding windows" to avoid premature learning and overlearning. Statistical analysis of its performance on several well-known classification problems demonstrates good generalization capability. It also reveals that locally adapting or scheduling the strategy parameters embedded in each individual network may provide a proper balance between MGNN's local and global search capabilities.
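The abstract's mention of "locally adapting or scheduling the strategy parameters embedded in each individual network" refers to the self-adaptive mutation that classical EP uses. The sketch below shows the standard log-normal self-adaptation rule applied to a flat weight vector; it is an illustration of that general EP technique, not the paper's exact MGNN operator, and the function name and signature are assumptions for this example.

```python
import numpy as np

def ep_mutate(weights, sigmas, rng=None):
    """Standard self-adaptive EP mutation (a sketch, not the exact MGNN
    operator): each weight carries its own strategy parameter sigma,
    which is log-normally perturbed before being used as the step size
    for mutating the weight itself."""
    rng = np.random.default_rng() if rng is None else rng
    n = weights.size
    tau_global = 1.0 / np.sqrt(2.0 * n)          # shared learning rate
    tau_local = 1.0 / np.sqrt(2.0 * np.sqrt(n))  # per-weight learning rate
    # Adapt the strategy parameters first; the log-normal update keeps
    # every sigma strictly positive.
    new_sigmas = sigmas * np.exp(
        tau_global * rng.standard_normal() + tau_local * rng.standard_normal(n)
    )
    # Then perturb each weight by its own adapted step size.
    new_weights = weights + new_sigmas * rng.standard_normal(n)
    return new_weights, new_sigmas
```

Because the step sizes evolve alongside the weights, selection implicitly tunes them: individuals whose sigmas are too large or too small tend to produce poor offspring, which is the balance between local and global search that the abstract alludes to.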