A Sensor-Fusion Drivable-Region and Lane-Detection System for Autonomous Vehicle Navigation in Challenging Road Scenarios

Abstract
Autonomous vehicle navigation is challenging because it must handle the diverse road scenarios found in real urban environments, particularly when only perception sensors are available and no position information is used. This paper presents a novel real-time system for optimal-drivable-region and lane detection in autonomous driving, based on the fusion of light detection and ranging (LIDAR) and vision data. The system uses a multisensory scheme to cover the most drivable areas in front of the vehicle. We propose a feature-level fusion method for the LIDAR and vision data, together with an optimal selection strategy for identifying the best drivable region. A conditional lane detection algorithm is then selectively executed according to the automatic classification of the optimal drivable region. Our system successfully handles both structured and unstructured roads. Experimental results demonstrate the reliability, effectiveness, and robustness of the system.