To set up magnetic resonance imaging (MRI) procedures of arbitrary voxel dimensions, slice orientation, and sequence timing in a reasonable time, some form of automatic gradient pulse calibration is required. One such method, involving simulation of gradient waveforms, is presented. Waveforms are modeled from measurements of the step response. The model divides each transition into three time regions: a "start" region in the first 0.3 ms, a "slew" region, and a "tail" region representing decay of the eddy current compensation error. In the "slew" region, the time derivative of the gradient, G'(t), is expressed as a function of G(t). The first two regions are nonlinear with respect to demand. The mean error in the simulated gradient is generally less than 0.04 mT m⁻¹ in spin echo sequences. Image signal-to-noise ratios resulting from sequences calibrated using the model are within 5% of those of empirically calibrated sequences.
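The abstract outlines the three-region step-response model only at a high level. The following minimal Python sketch illustrates how such a piecewise simulation of a single gradient transition might be structured; the function name, all numerical parameters, and the specific functional forms in each region (a quadratic build-up in the "start" region, a slew rate expressed as a function of G(t), and an exponential "tail") are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def simulate_gradient_step(g_demand, dt=1e-6, t_end=5e-3,
                           t_start=0.3e-3, slew_rate=8.0e4,
                           tail_amp=0.02, tail_tau=1.0e-3):
    """Sketch of a three-region step-response model for one gradient axis.

    g_demand  : demanded gradient amplitude in mT/m (assumed positive here)
    t_start   : length of the "start" region (0.3 ms, as in the abstract)
    slew_rate, tail_amp, tail_tau : placeholder parameters, not fitted values
    """
    t = np.arange(0.0, t_end, dt)
    g = np.zeros_like(t)
    t_settle = None  # time at which the "slew" region ends
    for i in range(1, len(t)):
        if t[i] < t_start:
            # "start" region (first 0.3 ms): nonlinear build-up with demand;
            # a quadratic ramp is used purely as an illustrative placeholder
            g[i] = 0.2 * g_demand * (t[i] / t_start) ** 2
        elif t_settle is None:
            # "slew" region: dG/dt expressed as a function of G(t);
            # here the slew rate tapers as G approaches the demanded value
            dGdt = slew_rate * (1.0 - g[i - 1] / g_demand)
            g[i] = g[i - 1] + dGdt * dt
            if g_demand - g[i] < 1e-3 * g_demand:
                g[i] = g_demand
                t_settle = t[i]
        else:
            # "tail" region: residual error from imperfect eddy current
            # compensation, modelled as an exponential decay towards zero
            g[i] = g_demand + tail_amp * g_demand * np.exp(
                -(t[i] - t_settle) / tail_tau)
    return t, g

# Example: simulate the response to a 10 mT/m step demand.
t, g = simulate_gradient_step(10.0)
```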