Long-Range Order and Spin Reduction in Magnetic-Chain Crystals

Abstract
A magnetic phase transition in a system of weakly coupled antiferromagnetic linear chains is discussed. The low dimensionality of the magnetic lattice gives rise to a large zero-point spin reduction, which strongly influences the field dependence of the sublattice magnetization. Experiments show that the magnitude of the sublattice magnetization depends in a distinctive way on both the applied field H0 and the temperature T. This contrasts with a normal three-dimensional antiferromagnet, where the magnitude of the sublattice magnetization is determined mainly by T and only weakly by H0.
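As a minimal sketch of the effect, in linear spin-wave theory for a nearest-neighbour Heisenberg antiferromagnet (the symbols \Delta S, \gamma_{\mathbf{k}}, z and the neighbour vectors \boldsymbol{\delta} are introduced here for illustration and are not defined in the abstract itself), the sublattice spin is reduced from its saturation value S to

\[
\langle S^{z} \rangle \;=\; S - \Delta S,
\qquad
\Delta S \;=\; \frac{1}{N}\sum_{\mathbf{k}}
\left( \frac{1}{2\sqrt{1-\gamma_{\mathbf{k}}^{2}}} - \frac{1}{2} \right),
\qquad
\gamma_{\mathbf{k}} \;=\; \frac{1}{z}\sum_{\boldsymbol{\delta}} e^{i\mathbf{k}\cdot\boldsymbol{\delta}} .
\]

For a three-dimensional simple-cubic lattice this sum gives the small value \Delta S \approx 0.078 per spin, whereas for an isolated chain it diverges logarithmically; a weak interchain coupling cuts off the divergence and leaves a large but finite zero-point reduction, which underlies the strong field dependence discussed here.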