Camera and Radar Sensor Fusion for Robust Vehicle Localization via Vehicle Part Localization

Many production vehicles are now equipped with both cameras and radar to provide various driver-assistance systems (DAS) with position information about surrounding objects. These sensors, however, cannot provide position information accurate enough to realize highly automated driving functions and other advanced driver-assistance systems (ADAS). Sensor fusion methods have been proposed to overcome these limitations, but they tend to show limited detection performance gains in terms of accuracy and robustness. In this study, we propose a camera-radar sensor fusion framework for robust vehicle localization based on vehicle part (rear corner) detection and localization. The main idea of the proposed method is to reinforce the azimuth-angle accuracy of the radar information by detecting and localizing the rear corner of the target vehicle in an image. This part-based fusion approach enables accurate vehicle localization as well as robust performance with respect to occlusions. For efficient part detection, several candidate points are generated around the initial radar point. Then, a widely adopted deep learning approach is used to detect and localize the left and right corners of target vehicles. The corner detection network outputs a reliability score based on the localization uncertainty of the center point of each corner part. Using these position reliability scores along with a particle filter, the most probable rear corner positions are estimated. The estimated positions (in pixel coordinates) are translated into angular data, and the surrounding vehicle is localized with respect to the ego-vehicle by combining the angular data of the rear corner with the radar's range data in the lateral and longitudinal directions.
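The corner-position estimation step described above can be sketched as a minimal weighted-resampling estimate. This is a simplified illustration, not the paper's implementation: the candidate list, reliability scores, and the `estimate_corner` helper are hypothetical, and the actual particle filter in the paper also propagates the state over time.

```python
import random

def estimate_corner(candidates, scores, n_particles=200):
    """Estimate the most probable rear-corner pixel position from candidate
    detections weighted by their reliability scores (hypothetical sketch).

    candidates: list of (u, v) pixel coordinates
    scores:     corresponding reliability scores from the corner network
    """
    total = sum(scores)
    weights = [s / total for s in scores]
    # Resample candidate positions in proportion to their reliability scores
    particles = random.choices(candidates, weights=weights, k=n_particles)
    # The mean of the resampled particles approximates the most probable position
    u = sum(p[0] for p in particles) / n_particles
    v = sum(p[1] for p in particles) / n_particles
    return (u, v)
```

With several candidates, the estimate is pulled toward the detections the network is most confident about, which is the role the reliability scores play in the proposed filter.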
The experimental results show that the proposed method provides significantly better localization performance in the lateral direction, with greatly reduced maximum errors (radar: 3.02 m, proposed method: 0.66 m) and root mean squared errors (radar: 0.57 m, proposed method: 0.18 m).
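The final fusion step, combining the camera-derived azimuth of the rear corner with the radar's range measurement, can be illustrated with a pinhole camera model. This is a hedged sketch under simplified assumptions (no lens distortion, camera and radar co-located and aligned); the function names and intrinsic values (`fx`, `cx`) are hypothetical placeholders, not values from the paper.

```python
import math

def pixel_to_azimuth(u, fx, cx):
    """Convert an image column u to an azimuth angle (rad) using a simple
    pinhole camera model; fx (focal length, px) and cx (principal point, px)
    come from the camera intrinsics."""
    return math.atan2(u - cx, fx)

def localize(radar_range, u, fx=1000.0, cx=640.0):
    """Fuse the radar range with the camera-derived azimuth of the detected
    rear corner: longitudinal = r*cos(theta), lateral = r*sin(theta)."""
    theta = pixel_to_azimuth(u, fx, cx)
    longitudinal = radar_range * math.cos(theta)
    lateral = radar_range * math.sin(theta)
    return longitudinal, lateral
```

Because the lateral position scales with sin(theta), an improvement in azimuth accuracy translates directly into the reduced lateral errors reported above.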
Publisher
IEEE (Institute of Electrical and Electronics Engineers, Inc.)
Issue Date
2020-05
Language
English
Article Type
Article
Citation

IEEE Access, v.8, pp. 75223-75236

ISSN
2169-3536
DOI
10.1109/ACCESS.2020.2985075
URI
http://hdl.handle.net/10203/274341
Appears in Collection
GT-Journal Papers(저널논문)