Training a One-Dimensional Classifier to Minimize the Probability of Error

Abstract
Some of the results of a study of asymptotically optimum nonparametric training procedures for two-category pattern classifiers are reported. The decision surfaces yielded by earlier forms of nonparametric training procedures generally do not minimize the probability of error. We derive a modification of the Robbins-Monro method of stochastic approximation, and show how this modification leads to training procedures that minimize the probability of error of a one-dimensional two-category pattern classifier. The class of probability density functions admitted by these training procedures is quite broad. We show that the sequence of decision points generated by any of these training procedures converges with probability one to the minimum-probability-of-error decision point.
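The idea described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical window-based Robbins-Monro-style iteration, not the paper's exact procedure: a scalar decision point t classifies x < t as class 1 and x >= t as class 2, and each training sample that falls within a window around t nudges t along a stochastic estimate of the gradient of the error probability, with step sizes satisfying the usual Robbins-Monro conditions (sum a_n diverges, sum a_n^2 converges). The window width c, step schedule, and function names are all illustrative assumptions.

```python
import random


def train_threshold(samples, t0=2.0, c=0.5, a0=5.0, shift=50.0):
    """Robbins-Monro-style stochastic approximation for a 1-D
    minimum-error decision point (illustrative sketch only).

    samples: iterable of (x, label) pairs with label in {1, 2}.
    Classify x < t as class 1 and x >= t as class 2; the error
    probability is minimized where p1*f1(t) = p2*f2(t).
    """
    t = t0
    for n, (x, label) in enumerate(samples, start=1):
        # Step sizes a_n = a0/(n + shift): sum a_n = inf, sum a_n^2 < inf.
        a_n = a0 / (n + shift)
        # Samples inside a window of half-width c around t give a crude
        # unbiased estimate of p2*f2(t) - p1*f1(t), the derivative of
        # the error probability with respect to t.
        if abs(x - t) < c:
            g = (1.0 if label == 2 else -1.0) / (2.0 * c)
            t -= a_n * g  # move t downhill on the estimated error
    return t


if __name__ == "__main__":
    # Two-category example: class 1 ~ N(-1, 1), class 2 ~ N(1, 1),
    # equal priors, so the minimum-error decision point is 0.
    random.seed(0)
    samples = []
    for _ in range(20000):
        if random.random() < 0.5:
            samples.append((random.gauss(-1.0, 1.0), 1))
        else:
            samples.append((random.gauss(1.0, 1.0), 2))
    t = train_threshold(samples)
    print(t)  # should settle near 0
```

In this symmetric Gaussian example the sequence of decision points drifts from the initial guess t0 = 2.0 toward the minimum-error point at 0, mirroring the convergence-with-probability-one result the abstract states for the broad class of densities the paper admits.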