Abstract
Previous work has shown that the magnetoresistance of carbon resistor thermometers in applied magnetic fields below 50 kOe, for temperatures in the liquid hydrogen or liquid helium range, exhibits a quadratic dependence on field. The present investigation of the magnetoresistance of 1/10 W, 100 Ω nominal room-temperature resistance Allen-Bradley carbon resistors in fields up to 100 kOe confirms that the magnetoresistance increases as $H^2$ at low fields, but indicates that a modification of this simple result is necessary for accurate thermometry in fields above 50 kOe. It is shown that the percentage change in resistance due to the application of a magnetic field, as a function of temperature and field, can be described by the equation $100\,\Delta R/R = C H^2 T^{-1.5}/(1 + H^2 H_0^{-2})$, where $R$ is the resistance in zero field, $\Delta R$ the change in resistance when a field $H$ is applied, and $T$ the temperature. $C$ is a constant that depends on the room-temperature resistance of the thermometer, and $H_0$ is an adjustable constant. Temperature accuracies of the order of 25 mdeg in ambient fields up to 100 kOe are readily obtained.
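As a minimal sketch of how this correction might be applied in practice (not taken from the paper), the following Python evaluates the fitted expression and inverts it to recover the zero-field resistance from a measurement made in field. The constants `C_FIT` and `H0_FIT` are placeholders only, since the abstract quotes no fitted values; both must be calibrated for the particular thermometer.

```python
# Sketch (not from the paper): the fitted magnetoresistance correction
#   100*dR/R = C * H^2 * T^(-1.5) / (1 + H^2/H0^2)
# and its use to recover the zero-field resistance R.

C_FIT = 1.0e-2   # placeholder: fitted constant; depends on room-temperature resistance
H0_FIT = 60.0    # placeholder: adjustable constant H0, in kOe


def magnetoresistance_percent(H_kOe: float, T_K: float) -> float:
    """Percentage change 100*dR/R at field H (kOe) and temperature T (K)."""
    return C_FIT * H_kOe**2 / T_K**1.5 / (1.0 + (H_kOe / H0_FIT) ** 2)


def zero_field_resistance(R_measured: float, H_kOe: float, T_K: float) -> float:
    """Recover R (zero-field resistance) from a measurement in field H,
    using R_measured = R * (1 + dR/R)."""
    return R_measured / (1.0 + magnetoresistance_percent(H_kOe, T_K) / 100.0)


if __name__ == "__main__":
    # Example: correction at 4.2 K (liquid helium) in a 100 kOe ambient field
    print(magnetoresistance_percent(100.0, 4.2))
    print(zero_field_resistance(1234.5, 100.0, 4.2))
```

The denominator captures the two regimes described above: for $H \ll H_0$ the expression reduces to the familiar quadratic $CH^2T^{-1.5}$, while for $H \gg H_0$ it saturates toward $CH_0^2T^{-1.5}$, which is why the unmodified quadratic law breaks down above about 50 kOe.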