Signal-to-noise requirements for interpreting submillimetre laser scattering experiments in a Tokamak plasma

Abstract
A Monte Carlo technique has been used to generate numerical simulations of the collective spectra of laser radiation scattered by plasma as they would be measured in a heterodyne receiver. Plasmas having ion temperatures in the range 500-5000 eV, contaminated by up to 2 per cent fully stripped oxygen, were investigated. The receiver output signal-to-noise ratio S required to measure Ti to better than 15 per cent by a chi-square curve fitting procedure was found to be about 6, but even poor estimates of impurity concentration demanded much higher values of S. Since signal-to-noise ratios at the input and the output of the heterodyne receiver are almost independent when the former exceeds one, only the width of the resolution intervals and the integration time (laser pulse length) exert an appreciable influence on S, and optimum values for these parameters are investigated.
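The simulation-and-fit procedure summarised above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a simple Gaussian model of the collective ion feature whose width scales with sqrt(Ti), defines the output signal-to-noise ratio S at the spectral peak, and recovers Ti by a chi-square grid search; the channel count, frequency range, and noise model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def spectrum(f, Ti):
    # Gaussian sketch of the collective ion feature; the Doppler
    # width grows with sqrt(Ti) (Ti in arbitrary units here).
    return np.exp(-f**2 / Ti)

def simulate_channels(Ti_true, S, n_channels=32):
    # One simulated heterodyne measurement: the clean spectrum sampled
    # in n_channels resolution intervals, plus Gaussian receiver noise
    # whose level is set by the output signal-to-noise ratio S
    # (defined at the spectral peak -- an assumption of this sketch).
    f = np.linspace(-3.0, 3.0, n_channels)
    sigma = spectrum(f, Ti_true).max() / S
    data = spectrum(f, Ti_true) + rng.normal(0.0, sigma, n_channels)
    return f, data, sigma

def fit_Ti(f, data, sigma, Ti_grid):
    # Chi-square grid search for the ion temperature.
    chi2 = [np.sum(((data - spectrum(f, Ti)) / sigma) ** 2) for Ti in Ti_grid]
    return Ti_grid[int(np.argmin(chi2))]

Ti_true = 1.0
f, data, sigma = simulate_channels(Ti_true, S=6.0)
Ti_est = fit_Ti(f, data, sigma, np.linspace(0.2, 5.0, 500))
```

Repeating the simulate-and-fit cycle many times and examining the spread of Ti_est as a function of S is the essence of the Monte Carlo estimate of the required signal-to-noise ratio.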