LiDAR-Based Object-Level SLAM for Autonomous Vehicles
Bingyi Cao, Daniel Göhring, Andreas Philipp – 2021
Simultaneous localization and mapping (SLAM) is an essential technique for autonomous driving. Recently, incorporating image recognition to generate semantically meaningful maps has become a new trend in visual SLAM research. In the field of LiDAR SLAM, however, this potential has not yet been fully explored. We propose a novel object-level SLAM system using 3D LiDARs for autonomous vehicles. We detect and track poles, walls, and parked cars, which are common along urban roads. This paper presents how we process the measurements of these three object shapes to build a graph-based optimization system, and how we use the geometric distribution of poles to detect loops. Experiments were carried out on datasets collected with a test vehicle in city traffic. The results show that our object-level SLAM system builds precise and semantically meaningful maps and produces more accurate pose estimates than state-of-the-art systems on our datasets.
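As a rough illustration of the graph-based optimization the abstract refers to, the sketch below builds a tiny 2D factor graph in which vehicle poses are linked by odometry factors and a single pole is modeled as a point landmark observed through bearing-range factors. This is not the authors' implementation: the choice of GTSAM, the 2D simplification, the bearing-range landmark model, and all numeric values are assumptions made purely for illustration.

```python
# Minimal 2D landmark-SLAM sketch with GTSAM (assumed library choice);
# x0, x1 are vehicle poses, l0 is a pole treated as a point landmark.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import L, X

# Illustrative noise models (values are assumptions, not from the paper)
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
odom_noise  = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.10]))
pole_noise  = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.30]))  # bearing, range

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Anchor the first pose and constrain the second pose with odometry
graph.add(gtsam.PriorFactorPose2(X(0), gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
graph.add(gtsam.BetweenFactorPose2(X(0), X(1), gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))
initial.insert(X(0), gtsam.Pose2(0.0, 0.0, 0.0))
initial.insert(X(1), gtsam.Pose2(1.9, 0.1, 0.0))

# The same pole observed from both poses ties the trajectory to the map
graph.add(gtsam.BearingRangeFactor2D(X(0), L(0), gtsam.Rot2.fromDegrees(45.0),
                                     float(np.sqrt(8.0)), pole_noise))
graph.add(gtsam.BearingRangeFactor2D(X(1), L(0), gtsam.Rot2.fromDegrees(90.0),
                                     2.0, pole_noise))
initial.insert(L(0), gtsam.Point2(2.1, 1.9))

# Jointly optimize the vehicle poses and the pole position
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose2(X(1)), result.atPoint2(L(0)))
```

In a full object-level system such as the one described in the abstract, walls and parked cars would need their own factor types, and loop closures found from the geometric layout of poles would add further constraints between revisited poses.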