Abstract
The quality of an imaging system is degraded by propagation anomalies that distort wavefronts traveling through the medium. Adaptive phase-deaberration algorithms compensate for phase errors in the wavefront; they suffer, however, when the amplitude of the wavefront is also significantly distorted. A theory is derived showing that the rise in the image background level, i.e., the average sidelobe floor (ASF), of a single point-like source image is proportional to the amplitude distortion of the wavefront and inversely proportional to the effective number of array elements. From this theory, the tolerance to amplitude distortion, after the phasefront has been corrected by a deaberration algorithm, can be calculated from the sidelobe-floor design requirement of a given array. Computer simulations show good agreement with the theory.
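The scaling stated above can be illustrated with a minimal Monte Carlo sketch, not the paper's own derivation or simulation setup. It assumes a narrowband linear array imaging a single broadside point source, perfect phase correction, Gaussian fractional amplitude errors of standard deviation `sigma`, and a Hann apodization; the effective number of elements is taken as (sum of weights)^2 / (sum of squared weights), and the simulated ASF is compared against the illustrative reference value sigma^2 / N_eff. All parameter values and names are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed array and error parameters (illustrative only).
N = 64          # number of array elements
d = 0.5         # element spacing in wavelengths
sigma = 0.3     # std of fractional amplitude distortion after phase correction
trials = 200    # Monte Carlo realizations

# Far-field beam pattern of a point source at broadside.
theta = np.linspace(-np.pi / 2, np.pi / 2, 2001)
n = np.arange(N)
steering = np.exp(2j * np.pi * d * np.outer(np.sin(theta), n))

# Nominal apodization; its deterministic sidelobes sit well below the
# error-induced floor, so the residual floor is dominated by amplitude errors.
apod = np.hanning(N)
n_eff = apod.sum() ** 2 / (apod ** 2).sum()   # effective number of elements

asf = []
for _ in range(trials):
    # Phase is assumed perfectly corrected; only amplitude errors remain.
    weights = apod * (1.0 + sigma * rng.standard_normal(N))
    pattern = np.abs(steering @ weights) ** 2
    pattern /= pattern.max()                   # normalize to main-lobe peak
    # Average sidelobe floor: mean power outside the (widened) main lobe.
    sidelobes = pattern[np.abs(np.sin(theta)) > 8.0 / N]
    asf.append(sidelobes.mean())

print(f"simulated ASF   : {10 * np.log10(np.mean(asf)):.1f} dB")
print(f"sigma^2 / N_eff : {10 * np.log10(sigma ** 2 / n_eff):.1f} dB")
```

Doubling `sigma` raises the simulated floor by about 6 dB, and doubling `N` lowers it by about 3 dB, consistent with a floor proportional to the amplitude-error variance and inversely proportional to the effective number of elements.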