Abstract
The conventional least-squared-distance method of fitting a line to a set of data points is unreliable when the amount of random noise in the input (such as an image) is significant compared with the amount of data actually correlated with the line. Points that lie far from the line (outliers) are usually just noise, yet they contribute the most to the squared-distance average, skewing the fitted line away from its correct position. The author presents a statistical method for separating the data of interest from the random noise, based on a maximum-likelihood principle.
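The failure mode the abstract describes is easy to reproduce. The sketch below is illustrative only and is not taken from the article: the data values, the function name `fit_line`, and the code itself are assumptions for demonstration. It fits y = a + bx by ordinary least squares to five collinear points, then repeats the fit with one gross outlier added, showing how strongly the single bad point pulls the line.

```c
/* Minimal sketch (not the author's code): ordinary least-squares fit of
 * y = a + b*x, showing how a single gross outlier skews the result.
 * The data values are invented for illustration. */
#include <stdio.h>

static void fit_line(const double *x, const double *y, int n,
                     double *a, double *b)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];        sy  += y[i];
        sxx += x[i] * x[i]; sxy += x[i] * y[i];
    }
    double denom = n * sxx - sx * sx;
    *b = (n * sxy - sx * sy) / denom;   /* slope     */
    *a = (sy - *b * sx) / n;            /* intercept */
}

int main(void)
{
    /* Five points lying exactly on y = x, plus one point far off the line. */
    double x[] = { 0, 1, 2, 3, 4, 5 };
    double y[] = { 0, 1, 2, 3, 4, 50 };   /* last point is "noise" */
    double a, b;

    fit_line(x, y, 5, &a, &b);            /* fit without the outlier */
    printf("clean fit:  y = %.2f + %.2f x\n", a, b);   /* y = 0.00 + 1.00 x */

    fit_line(x, y, 6, &a, &b);            /* fit with the outlier    */
    printf("skewed fit: y = %.2f + %.2f x\n", a, b);   /* slope jumps to ~7.4 */
    return 0;
}
```

With the outlier included, the fitted slope changes from 1.0 to roughly 7.4, which is the behavior that motivates replacing the plain least-squares criterion with the maximum-likelihood separation of signal from noise described in the article.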