autonomous-driving system, environmental perception, simulated test, deep-learning model
Self-driving vehicles require extensive testing to prevent fatal accidents and ensure correct operation in the physical world. However, conducting vehicle tests on real roads is difficult because such tests are expensive and labor intensive. In this study, we used an autonomous-driving simulator and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated CarlaSim, a dataset built from virtual traffic scenarios comprising 15 000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. We then developed a Multi-Sensor Fusion Perception (MSFP) model that consumes the two-modal data and detects objects in the scenes. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrated the effectiveness of our proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will facilitate the future development of simulated autonomous-driving tests.
Tsinghua University Press
Lin, Chunmian; Tian, Daxin; Duan, Xuting; and Zhou, Jianshan
"3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems,"
Complex System Modeling and Simulation: Vol. 1, Iss. 1, Article 4.
Available at: https://dc.tsinghuajournals.com/complex-system-modeling-and-simulation/vol1/iss1/4