Abstract
In recent years, DLR has developed an Enhanced and Synthetic Vision System (ESVS) test bed, consisting of several imaging sensors mounted on DLR's DO-228 test aircraft together with on-board computing devices for acquiring, storing and displaying the sensor data. The Institute of Flight Guidance of DLR has documented its fundamental experience and investigations in real-time processing, image analysis, data fusion and the extraction of navigation data, especially from millimetre-wave radar images, in several publications discussing the generation of "useful" displays for pilot support. The present document describes the development of a considerably robust and simple method for estimating the relative position of an aircraft with respect to a runway based on camera images alone (TV, infrared or PMMW radar). The special feature of this method is that neither a calibrated camera (with respect to focal length and mounting angles relative to the aircraft) is required, nor do special points of the runway have to be identified and their 3-D locations known. The only reference to the 3-D world that has to be known is the width of the runway stripe. The presented algorithm computes both the relative height of the aircraft above the runway stripe and the lateral deviation from the runway centre line. The robustness and the minimal requirement for real-world 3-D data reduce the effort needed for certification, calibration and maintenance of a camera-based positioning system. Furthermore, the presented method is not restricted to the ESVS application. The estimation of lateral guidance data for a car driving on a road is another example of an application. In this case, no real-world 3-D knowledge about the road is required; only the mounting height of the camera above the road has to be known.
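The abstract does not reproduce the authors' algorithm, but the underlying projective geometry can be sketched under idealized assumptions: a pinhole camera with square pixels whose optical axis is parallel to the runway centre line. In that model the image lines of the two runway edges pass through the runway's vanishing point, and their slopes through that point determine height and lateral offset from the known runway width alone, since the focal length cancels. The function names and the idealized geometry below are illustrative assumptions, not the paper's actual method, which also handles uncalibrated mounting angles.

```python
def edge_slopes(h, d, runway_width):
    """Forward model (idealized): image slopes of the runway edges,
    measured through the vanishing point, for a camera at height h
    and lateral offset d right of the centre line.

    An edge at lateral world offset x relative to the camera projects
    to the line v/u = -h/x through the vanishing point, so its slope
    is s = -h/x; the focal length cancels.
    """
    x_left = -runway_width / 2.0 - d    # left edge relative to camera
    x_right = runway_width / 2.0 - d    # right edge relative to camera
    return -h / x_left, -h / x_right


def height_and_offset_from_edge_slopes(s_left, s_right, runway_width):
    """Invert the forward model: recover (h, d) from the two edge slopes
    and the known runway width.

    From x = -h/s for each edge and x_right - x_left = runway_width:
        h = -W * s_l * s_r / (s_l - s_r)
        d = h * (s_l + s_r) / (2 * s_l * s_r)
    """
    h = -runway_width * s_left * s_right / (s_left - s_right)
    d = h * (s_left + s_right) / (2.0 * s_left * s_right)
    return h, d


if __name__ == "__main__":
    # Synthetic check: camera 20 m above the runway, 10 m right of the
    # centre line, runway 40 m wide.
    sl, sr = edge_slopes(20.0, 10.0, 40.0)
    print(height_and_offset_from_edge_slopes(sl, sr, 40.0))
    # -> approximately (20.0, 10.0)
```

Note that only the ratio of slopes enters the estimate, which is why no focal-length calibration is needed in this idealized setting; a non-zero pitch or yaw of the camera shifts the vanishing point and would have to be estimated from the image as well.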
