Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution
- 1 May 2002
- journal article
- Published by MIT Press in Neural Computation
- Vol. 14 (5), 1183-1194
- https://doi.org/10.1162/089976602753633448
Abstract
This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, are calculated by analyzing the local environment of each training pattern; their combination forms the covariance matrix of that pattern. This automation has two advantages: first, it frees the neural network designer from specifying the complete covariance matrix, and second, it yields a network with better generalization ability than the original model. A variation of the well-known two-spiral problem and real-world examples from the UCI Machine Learning Repository demonstrate a classification rate better than that of the original probabilistic neural network and show that this model can outperform other well-known classification techniques.
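The abstract's idea can be illustrated with a minimal sketch. The paper's exact procedure is not reproduced here; the code below assumes (as one plausible reading) that each training pattern's local environment is its k nearest neighbours, whose scatter is eigendecomposed into a rotation matrix (eigenvectors) and a matrix of variances (eigenvalues) that recombine into a per-pattern covariance. The function names, the choice of k, and the regularization term are illustrative, not from the paper.

```python
import numpy as np

def local_covariances(X, k=5, reg=1e-3):
    """Per-pattern covariance from the local neighbourhood of each point.

    Assumption (not from the paper): the "local environment" is the k
    nearest neighbours; reg keeps each matrix invertible.
    """
    n, d = X.shape
    covs = []
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = X[np.argsort(dists)[1:k + 1]]          # skip the point itself
        scatter = np.cov(nbrs, rowvar=False) + reg * np.eye(d)
        # Decompose into the two matrices the abstract mentions:
        # eigenvectors = rotation matrix, eigenvalues = variances,
        # then recombine them: Sigma = R V R^T.
        variances, rotation = np.linalg.eigh(scatter)
        covs.append(rotation @ np.diag(variances) @ rotation.T)
    return np.array(covs)

def pnn_classify(X, y, covs, x):
    """Probabilistic neural network decision: one Gaussian kernel per
    training pattern, summed within each class; pick the largest sum."""
    scores = {}
    for c in np.unique(y):
        s = 0.0
        for xi, Si in zip(X[y == c], covs[y == c]):
            diff = x - xi
            s += (np.exp(-0.5 * diff @ np.linalg.solve(Si, diff))
                  / np.sqrt(np.linalg.det(Si)))
        scores[c] = s
    return max(scores, key=scores.get)
```

A usage example: fit the per-pattern covariances on a small two-class data set, then classify a query point by summing its kernel responses per class. With full covariances the kernels stretch along the local direction of the data, which is what gives the anisotropic model its edge on curved class boundaries such as the two-spiral problem.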