Abstract
This paper describes a sensor-fusion algorithm for the navigation system of autonomous guided vehicles equipped with two navigation subsystems: an odometric one, based on encoders mounted on the vehicle's wheels, and an inertial one, based on a gyroscope. In the proposed algorithm the accuracy of each navigation subsystem is estimated as a function of the manoeuvre actually being carried out, which is identified from the navigation data, and the outputs of the two sensors are then fused according to their accuracy ratios. The method is explained starting from the measurement models and their calibration as a function of the different manoeuvres. The proposed algorithm was calibrated experimentally on a scaled mock-up of an industrial robot. Whereas the mean absolute value of the estimated distance error over the path is 49 mm with odometric navigation alone and 28 mm with odometric plus inertial attitude navigation, it falls to 15 mm when the two measurement systems are fused. A recursive method for estimating the evolution of spatial uncertainty, which also accounts for systematic effects, is proposed.
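The accuracy-weighted fusion described above can be illustrated with a minimal sketch. The per-manoeuvre variances, the manoeuvre labels, and the function name below are illustrative assumptions, not values or identifiers from the paper, which calibrates the sensor accuracies experimentally.

```python
# Sketch of manoeuvre-dependent, inverse-variance-weighted fusion of two
# heading estimates. All numbers are placeholders, not the paper's calibration.
VARIANCES = {
    "straight": {"odo": 1e-4, "gyro": 4e-4},  # encoders accurate on straights
    "turn":     {"odo": 9e-4, "gyro": 1e-4},  # wheel slip degrades odometry in turns
}

def fuse_heading(theta_odo, theta_gyro, manoeuvre):
    """Inverse-variance weighted average of two heading estimates (radians)."""
    v = VARIANCES[manoeuvre]
    w_odo, w_gyro = 1.0 / v["odo"], 1.0 / v["gyro"]
    theta = (w_odo * theta_odo + w_gyro * theta_gyro) / (w_odo + w_gyro)
    var = 1.0 / (w_odo + w_gyro)  # fused variance never exceeds either input's
    return theta, var
```

During a turn the gyroscope's smaller assumed variance makes its reading dominate the fused estimate, while on a straight segment the weighting shifts toward odometry; this captures the idea of fusing according to manoeuvre-dependent accuracy ratios.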