A Robust Top-Down Approach for Rotation Estimation and Vanishing Points Extraction by Catadioptric Vision in Urban Environment

A key requirement for Unmanned Aerial Vehicle (UAV) applications is attitude stabilization of the aircraft, which requires knowledge of its orientation. It is now well established that traditional navigation equipment, such as GPS or INS, suffers from several disadvantages. For this reason, several works have proposed vision-based approaches to the problem. In particular, catadioptric vision is increasingly used because it gathers much more information from the environment than traditional perspective cameras, which improves the robustness of UAV attitude estimation. Rotation estimation from conventional and catadioptric images has been extensively studied. Although interesting results can be obtained, existing methods have non-negligible limitations, such as difficult feature matching (e.g., repeated texture, blurring, or illumination changes) or high computational cost (e.g., vanishing point extraction or analysis in the frequency domain). To overcome these limitations, this paper presents a top-down approach for estimating the rotation and extracting the vanishing points in catadioptric images. This new framework is accurate and can run in real time. To obtain ground-truth data, we also calibrate our catadioptric camera with a gyroscope. Finally, experimental results on a real video sequence are presented and compared to the ground-truth data obtained by the gyroscope.
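
The abstract does not detail the algorithm itself, so the following is only a minimal sketch of two generic building blocks it alludes to, not the paper's top-down method: (1) lifting a catadioptric pixel to the unit viewing sphere with the unified central projection model, and (2) recovering the rotation that aligns vanishing directions observed in two frames via orthogonal Procrustes (Kabsch). The intrinsics and the mirror parameter xi below are hypothetical example values.

```python
# Illustrative sketch only; not the authors' algorithm.
import numpy as np


def lift_to_sphere(u, v, fx, fy, cx, cy, xi):
    """Back-project a catadioptric pixel (u, v) to a unit vector on the
    viewing sphere using the unified central projection model."""
    x = (u - cx) / fx              # normalized image coordinates
    y = (v - cy) / fy
    r2 = x * x + y * y
    # Inverse of the sphere-to-plane projection of the unified model.
    factor = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (1.0 + r2)
    s = np.array([factor * x, factor * y, factor - xi])
    return s / np.linalg.norm(s)


def rotation_from_directions(d_ref, d_cur):
    """Least-squares rotation R with d_cur_i ~ R @ d_ref_i, for matched unit
    direction vectors (e.g., vanishing directions), via SVD (Kabsch)."""
    H = np.asarray(d_cur).T @ np.asarray(d_ref)   # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # enforce det = +1
    return U @ D @ Vt


if __name__ == "__main__":
    # Hypothetical vanishing directions of a Manhattan-like urban scene in a
    # reference frame, and the same directions after a known test rotation.
    d_ref = np.eye(3)
    a = np.deg2rad(10.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    d_cur = (R_true @ d_ref.T).T
    R_est = rotation_from_directions(d_ref, d_cur)
    print(np.allclose(R_est, R_true))  # True
```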
Issue Date
2011-07-14
URI
http://hdl.handle.net/10203/24620
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
sample_new.pdf (1.53 MB)
