MIR-VIO: Mutual Information Residual-based Visual Inertial Odometry with UWB Fusion for Robust Localization

For many years, there has been impressive progress in visual odometry for mobile robots and drones. Nevertheless, visual perception remains a challenging field: a monocular camera cannot recover correct metric scale on its own, and vision-based estimation is vulnerable to illumination changes. In this paper, UWB sensor fusion is proposed within a visual inertial odometry algorithm to mitigate these problems. We design a cost function based on mutual information that incorporates the UWB measurements. Because the uncertainty of the UWB signal model grows as the distance between the UWB anchor and the tag increases, we introduce a new residual term into the cost function that accounts for this characteristic. In indoor experiments with this methodology, UWB sensor fusion resolved the initialization problem in environments with few feature points and made localization robust, and the residual term based on the concept of mutual information yielded the most robust odometry.
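As a rough illustration of the idea described in the abstract, and not the paper's exact formulation, UWB ranges can be folded into a standard tightly coupled VIO cost as one more residual whose weight shrinks as the anchor-tag distance grows. All symbols below are illustrative assumptions rather than notation from the paper: X denotes the estimated states, r_prior, r_IMU and r_cam are the usual prior, preintegrated-IMU and reprojection residuals with a robust loss rho, d_i is the i-th measured UWB range, p_{t,i} the tag position at that time, p_a the anchor position, and sigma(d) a standard deviation that increases with distance.

\[
\min_{\mathcal{X}}\;
  \lVert r_{\mathrm{prior}}(\mathcal{X}) \rVert^{2}
  + \sum_{k} \lVert r_{\mathrm{IMU},k}(\mathcal{X}) \rVert^{2}_{\Sigma_{\mathrm{IMU}}}
  + \sum_{j} \rho\!\left( \lVert r_{\mathrm{cam},j}(\mathcal{X}) \rVert^{2}_{\Sigma_{\mathrm{cam}}} \right)
  + \sum_{i} \frac{\bigl( d_{i} - \lVert p_{t,i} - p_{a} \rVert \bigr)^{2}}{\sigma^{2}(d_{i})},
\qquad \sigma(d)\ \text{increasing in}\ d .
\]

In this sketch, the growing sigma(d) simply down-weights long-range UWB measurements, which matches the stated signal model; the mutual-information weighting used in the paper itself is not reproduced here.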
Publisher
Institute of Control, Robotics and Systems
Issue Date
2021-10-12
Language
English
Citation

21st International Conference on Control, Automation and Systems (ICCAS), pp. 91-96

ISSN
2093-7121
DOI
10.23919/ICCAS52745.2021.9649888
URI
http://hdl.handle.net/10203/288390
Appears in Collection
EE-Conference Papers (Conference Papers)