Bird’s eye view localization of surrounding vehicles: Longitudinal and lateral distance estimation with partial appearance

Abstract
On-road vehicle detection is essential for perceiving the driving environment, and localizing detected vehicles helps drivers predict possible risks and avoid collisions. However, few works address vehicle detection under partial appearance, and localization of partially visible vehicles has not been explored. In this paper, a novel framework for detecting and localizing vehicles with partial appearance is proposed using stereo vision and geometry. First, the original images from the stereo camera are processed to form a v-disparity map. After object detection using the v-disparity map, vehicle candidates are generated with prior knowledge of possible vehicle locations in the image. Deep learning-based verification completes vehicle detection. For each detected vehicle, a partially visible vehicle tracking algorithm is newly introduced: it detects the vehicle edge on the ground, defined as the grounded edge, and then selects a reference point for Kalman filter tracking. Finally, a rectangular box is drawn on the bird’s eye view to represent the vehicle’s longitudinal and lateral location. The proposed system successfully detects and tracks partially visible vehicles. Localization performance is evaluated on highway and urban datasets, yielding standard deviations below 1.5 m in longitudinal error and 0.4 m in lateral error.
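The reference-point tracking step described above can be sketched with a standard constant-velocity Kalman filter over the bird's-eye-view position. This is a minimal illustration, not the paper's implementation: the state layout, noise covariances, and class name are assumptions introduced here.

```python
import numpy as np

class ReferencePointTracker:
    """Constant-velocity Kalman filter for a vehicle's grounded-edge
    reference point on the bird's eye view.

    State is [x, y, vx, vy]; only (x, y) is measured each frame.
    Noise parameters below are illustrative assumptions, not values
    from the paper."""

    def __init__(self, x0, y0, dt=0.1):
        self.x = np.array([x0, y0, 0.0, 0.0])            # initial state
        self.P = np.eye(4) * 10.0                        # state covariance
        self.F = np.array([[1, 0, dt, 0],                # constant-velocity
                           [0, 1, 0, dt],                # transition model
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                 # measurement model:
                           [0, 1, 0, 0]], dtype=float)   # position only
        self.Q = np.eye(4) * 0.01                        # process noise
        self.R = np.eye(2) * 0.25                        # measurement noise

    def predict(self):
        """Propagate the state one time step ahead."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        """Correct the prediction with a measured reference point."""
        z = np.array([zx, zy])
        y = z - self.H @ self.x                          # innovation
        S = self.H @ self.P @ self.H.T + self.R          # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

In use, `predict()` is called once per frame and `update()` whenever the grounded-edge detector yields a reference point, so the track coasts through frames where the partially visible vehicle is missed.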
Publisher
ELSEVIER SCIENCE BV
Issue Date
2019-02
Language
English
Citation
ROBOTICS AND AUTONOMOUS SYSTEMS, v.112, pp.178 - 189
ISSN
0921-8890
DOI
10.1016/j.robot.2018.11.008
URI
http://hdl.handle.net/10203/248981
Appears in Collection
GT-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.