Abstract
The electrical resistivities of binary chromium alloys containing 1.02, 2.04, and 3.06 at. % vanadium; 0.33, 0.57, and 1.13 at. % niobium; and 0.14, 0.30, and 0.58 at. % tantalum have been measured as a function of temperature between 4 and 325 °K. These studies show that the Néel temperature, TN, decreases with increasing solute concentration in such a manner that plots of ln TN versus solute content are linear. The curve of TN vs. solute concentration for tantalum, within the solubility limit, overlaps that for vanadium. The corresponding curve for niobium lies slightly above the curve shared by vanadium and tantalum. This behavior cannot be explained simply in terms of the localization of the d-wave functions within the model of Fedders and Martin. The increase in the residual electrical resistivity at 4.2 °K due to 1 at. % vanadium, niobium, and tantalum has been found to be 0.45, 2.7, and 3.6 μΩ cm, respectively. The residual resistivity vs. vanadium concentration curve shows an anomaly at about 3 at. %, attributed to changes in the Fermi surface accompanying the disappearance of the antiferromagnetic state. None of these binary chromium alloys exhibits the low-temperature Kim effect found in certain other chromium alloys.
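The stated linearity of ln TN with solute content can be written compactly; the following relation is a sketch implied by that statement, where the suppression coefficient α and concentration c are hypothetical symbols not taken from the source:

```latex
\ln T_N(c) = \ln T_N(0) - \alpha c
\quad\Longrightarrow\quad
T_N(c) = T_N(0)\, e^{-\alpha c},
```

i.e., an exponential decrease of the Néel temperature with concentration, with α presumably differing slightly for niobium relative to vanadium and tantalum, since the niobium curve lies slightly higher.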