Efficient training procedures for adaptive kernel classifiers
- 9 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
The authors investigate two training schemes for adapting the locations and receptive-field widths of the centroids in radial basis function classifiers. The adaptive kernel classifier adjusts the responses of the hidden units during training using an extension of the Delta rule, leading to improved performance and reduced network size. The rapid kernel classifier, on the other hand, uses the faster learning vector quantization (LVQ) algorithm to adapt the centroids; this network shows a remarkable reduction in training time with little compromise in accuracy. The performance of the two networks is evaluated on underwater acoustic transient signals.
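The paper itself does not give the update equations, but the idea of extending the Delta rule from the output weights to the centroid locations and receptive-field widths can be sketched as plain gradient descent on a squared error. The following is a minimal illustrative sketch, not the authors' exact formulation: a one-dimensional Gaussian RBF network whose weights, centers, and widths are all adapted from the output error.

```python
import math

def rbf(x, c, s):
    # Gaussian hidden-unit response with center c and receptive-field width s
    return math.exp(-((x - c) ** 2) / (2.0 * s * s))

def predict(x, centers, widths, weights, bias):
    # Network output: weighted sum of the hidden-unit responses plus a bias
    return sum(w * rbf(x, c, s)
               for w, c, s in zip(weights, centers, widths)) + bias

def train(samples, centers, widths, weights, bias, lr=0.1, epochs=200):
    # Stochastic gradient descent on the squared error E = (t - y)^2 / 2.
    for _ in range(epochs):
        for x, t in samples:
            err = t - predict(x, centers, widths, weights, bias)
            for j in range(len(centers)):
                h = rbf(x, centers[j], widths[j])
                w = weights[j]
                # Classic Delta-rule update for the output weight ...
                weights[j] += lr * err * h
                # ... extended to the centroid location and the width,
                # using dh/dc = h*(x-c)/s^2 and dh/ds = h*(x-c)^2/s^3.
                centers[j] += lr * err * w * h * (x - centers[j]) / widths[j] ** 2
                widths[j] += lr * err * w * h * (x - centers[j]) ** 2 / widths[j] ** 3
            bias += lr * err
    return centers, widths, weights, bias

# Hypothetical toy problem: two 1-D clusters with targets -1 and +1.
samples = [(-1.2, -1.0), (-0.8, -1.0), (0.9, 1.0), (1.1, 1.0)]
centers, widths = [-0.5, 0.5], [1.0, 1.0]
weights, bias = [0.0, 0.0], 0.0
centers, widths, weights, bias = train(samples, centers, widths, weights, bias)
```

Because the hidden units are locally tuned, each sample's error moves only the centroids whose receptive fields it falls inside, which is what allows the trained network to cover the data with fewer units than a fixed-center RBF classifier.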