Abstract
The decrease in the detectability of a gated sinusoidal signal in noise caused by deliberately introducing uncertainty about the signal's frequency is no greater than 3 db, even in an extreme condition of uncertainty. In this extreme condition the signal duration is 0.1 sec and the signal frequency is varied between 500 and 4000 cps. This effect is not critically dependent on signal duration. Moreover, in the uncertain-frequency conditions the observers not only detect the signal but also display at least gross information about its frequency. Several models suggested in previous studies are considered; the decrease observed in the data falls far short of their predictions. An interpretation suggested by the data is that observers in a detection task, even when a signal of fixed frequency is used, are highly uncertain about the exact physical parameters of the signal. Stated another way, the observer never tests for the presence or absence of a signal on the basis of a single simple hypothesis. On this assumption, little decrease in detectability should be expected when frequency uncertainty is deliberately introduced. This interpretation suggests that the same result would be obtained if time, rather than frequency, were the major experimental variable.