Theoretical Comparison of a Class of Feature Selection Criteria in Pattern Recognition
- 1 September 1971
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Computers
- Vol. C-20 (9), 1054-1056
- https://doi.org/10.1109/t-c.1971.223402
Abstract
Distance measures and information functions for feature selection are compared. The comparison is based on the available tight upper and lower bounds on the probability of misrecognition, the rates of change of that probability, the effectiveness of a feature subset, and computational complexity.
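The bounds referred to above include the Bhattacharyya bound on the probability of misrecognition. As an illustrative sketch (not reproduced from the paper itself), the snippet below computes the closed-form Bhattacharyya distance between two univariate Gaussian class-conditional densities and the resulting upper bound on the Bayes error; the function names are hypothetical.

```python
import math

def bhattacharyya_gaussian(m1, s1, m2, s2):
    """Bhattacharyya distance B between N(m1, s1^2) and N(m2, s2^2).

    Closed form for univariate Gaussians:
    B = (m1 - m2)^2 / (4 (v1 + v2)) + 0.5 * ln((v1 + v2) / (2 s1 s2))
    """
    v1, v2 = s1 * s1, s2 * s2
    return 0.25 * (m1 - m2) ** 2 / (v1 + v2) \
        + 0.5 * math.log((v1 + v2) / (2.0 * s1 * s2))

def bhattacharyya_error_bound(p1, p2, B):
    """Upper bound on the Bayes probability of misrecognition:
    P_e <= sqrt(p1 * p2) * exp(-B), with priors p1, p2."""
    return math.sqrt(p1 * p2) * math.exp(-B)

# Two unit-variance classes with means 0 and 2, equal priors.
B = bhattacharyya_gaussian(0.0, 1.0, 2.0, 1.0)        # B = 0.5
bound = bhattacharyya_error_bound(0.5, 0.5, B)         # ~0.303
```

A feature subset that increases B tightens this bound, which is the sense in which such distance measures serve as feature selection criteria.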
This publication has 8 references indexed in Scilit:
- Probability of error, equivocation, and the Chernoff bound. IEEE Transactions on Information Theory, 1970
- Feature Selection in Pattern Recognition. IEEE Transactions on Systems Science and Cybernetics, 1970
- A class of upper bounds on probability of error for multihypotheses pattern recognition (Corresp.). IEEE Transactions on Information Theory, 1969
- On the best finite set of linear observables for discriminating two Gaussian signals. IEEE Transactions on Information Theory, 1967
- The Divergence and Bhattacharyya Distance Measures in Signal Selection. IEEE Transactions on Communications, 1967
- Inequalities between information measures and error probability. Journal of the Franklin Institute, 1966
- A simple derivation of the coding theorem and some applications. IEEE Transactions on Information Theory, 1965
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation. The Annals of Mathematical Statistics, 1955