Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion

Multi-robot state estimation is crucial for real-time, accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that can limit its use: LiDAR sensors demand high payload capacity, camera-based methods require matching image features between robots, and UWB sensors need known, fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates these requirements. We use an anchor-free UWB setup to establish a global coordinate system shared by all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its own local coordinate system. By optimizing each robot's local odometry with inter-robot range measurements, the robots' positions can be estimated robustly without an extensive sensor setup or fixed infrastructure. Our method offers a simple yet effective solution for accurate, real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements.
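The core step the abstract describes — aligning each robot's local visual-inertial odometry into a shared frame using only anchor-free inter-robot ranges — can be sketched as a small least-squares problem. This is an illustrative sketch, not the paper's implementation: the planar (yaw + 2-D translation) frame parameterization, the function names, and the synthetic data are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

def range_residuals(x, p_a, p_b, ranges):
    """x = [yaw, tx, ty]: pose of robot B's local frame expressed in A's frame.

    p_a, p_b: (N, 2) VIO positions of robots A and B in their own local frames;
    ranges: (N,) inter-robot UWB range measurements taken at the same times.
    """
    yaw, tx, ty = x
    R = np.array([[np.cos(yaw), -np.sin(yaw)],
                  [np.sin(yaw),  np.cos(yaw)]])
    p_b_in_a = p_b @ R.T + np.array([tx, ty])  # map B's odometry into A's frame
    return np.linalg.norm(p_a - p_b_in_a, axis=1) - ranges

# Synthetic check: B's frame is rotated 0.5 rad and shifted (2, -1) w.r.t. A's.
rng = np.random.default_rng(0)
p_a = rng.uniform(-5.0, 5.0, size=(50, 2))       # A's positions, frame A
p_b_true = rng.uniform(-5.0, 5.0, size=(50, 2))  # B's positions, also frame A
yaw_t, t_t = 0.5, np.array([2.0, -1.0])
R_t = np.array([[np.cos(yaw_t), -np.sin(yaw_t)],
                [np.sin(yaw_t),  np.cos(yaw_t)]])
p_b = (p_b_true - t_t) @ R_t                     # B's positions in its frame B
ranges = np.linalg.norm(p_a - p_b_true, axis=1)  # anchor-free UWB ranges

sol = least_squares(range_residuals, x0=np.zeros(3), args=(p_a, p_b, ranges))
```

Once the inter-frame transform is estimated, every robot's local odometry can be expressed in the shared frame without any fixed UWB anchors; the full method would refine this jointly over trajectories rather than in one batch solve.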
Publisher
Institute of Control, Robotics and Systems
Issue Date
2023-11
Language
English
Article Type
Article
Citation
Journal of Institute of Control, Robotics and Systems, v.29, no.11, pp. 859-865
ISSN
1976-5622
DOI
10.5302/j.icros.2023.23.0135
URI
http://hdl.handle.net/10203/315014
Appears in Collection
EE-Journal Papers