Ultimate quantum limits on phase measurement

Abstract
The Susskind-Glogower (SG) phase operator is shown to be the maximum-likelihood (ML) quantum measurement of optical phase for all input states. The ML performance of the SG measurement is optimized, at average photon number N, by an input state with number-ket representation of the form ψ_n = A/(1+n), for 0 ≤ n ≤ M < ∞. The optimum phase-error behavior, measured by the reciprocal peak likelihood, is proportional to 1/N², as opposed to the 1/√N and 1/N dependences of coherent-state and optimum squeezed-state interferometers.
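As a numerical illustration of the optimum state quoted above, the following sketch constructs the number-ket amplitudes ψ_n = A/(1+n) for 0 ≤ n ≤ M, fixes A by normalization, and evaluates the resulting average photon number N. The truncation M = 1000 is an arbitrary choice for illustration; the abstract only requires M < ∞.

```python
import numpy as np

# Amplitudes psi_n = A/(1+n) on the number kets |0>, ..., |M>,
# with A chosen so that sum_n |psi_n|^2 = 1.
M = 1000                          # illustrative truncation (any finite M)
n = np.arange(M + 1)
psi = 1.0 / (1 + n)
psi /= np.linalg.norm(psi)        # A = normalization constant

prob = psi**2                     # photon-number distribution |psi_n|^2
N_avg = np.sum(n * prob)          # average photon number N of this state

print(f"normalization: {prob.sum():.6f}")
print(f"average photon number N (M={M}): {N_avg:.2f}")
```

Because the amplitudes fall off only as 1/(1+n), the average photon number grows slowly (roughly logarithmically) with the truncation M, so large M is needed to reach large N.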