Abstract
The purpose of this study was to assess the effects of computer management of pupil performance databases and of alternative methods for evaluating those databases. Subjects were 18 teachers who were randomly assigned to either a computer or a noncomputer group and to either a goal-based or an experimental data evaluation method. Each teacher selected two mildly handicapped pupils for participation and implemented ongoing curriculum-based progress monitoring in accordance with the assigned experimental treatment for 15 weeks. Achievement was measured at the beginning and end of the study, and fidelity of treatment was assessed during the intervention. Multivariate analyses of variance indicated significant interactions. For the noncomputer group, performance was comparable across the two data evaluation conditions; for the computer group, performance was superior in the goal-based evaluation condition. Patterns of results were similar on the student achievement and teacher fidelity measures. Implications for practice and additional research are discussed.