Abstract
In earlier articles on this subject, it was asserted that the transformation arising from the generalized optical theorem for inverse scattering of scalar waves is completely continuous in certain spaces. In this article we show that this is not correct: the transformation is not compact in these spaces. We also obtain an improved uniqueness result for the case in which there is no spherical symmetry. The article concludes with a discussion of the proper setting of the unitarity equation (generalized optical theorem) in the larger context of the inverse scattering problem.