Vision and LIDAR-based autonomous navigation system for indoor and outdoor flight of unmanned aerial vehicles

DC Field: Value
dc.contributor.advisor: Shim, Hyun-Chul
dc.contributor.advisor: 심현철
dc.contributor.author: Huh, Sung-Sik
dc.contributor.author: 허성식
dc.date.accessioned: 2015-04-23T02:06:51Z
dc.date.available: 2015-04-23T02:06:51Z
dc.date.issued: 2014
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=591870&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/196189
dc.description: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST), Department of Aerospace Engineering, 2014.8, [vii, 65 p.]
dc.description.abstract: The Global Positioning System (GPS) is widely used to aid the navigation of aerial vehicles. However, GPS cannot be used indoors, and indoor environments are often cluttered with obstacles that make collisions dangerous. Therefore, alternative navigation methods need to be developed for unmanned aerial vehicles (UAVs) flying in GPS-denied environments. Such an autonomous navigation system must provide a six-degree-of-freedom (6-DOF) pose estimate and a three-dimensional (3-D) map. In this dissertation, a vision and scanning Light Detection and Ranging (LIDAR)-based autonomous navigation system for indoor and outdoor flight of UAVs is proposed. First, an integrated navigation sensor module comprising a camera, a LIDAR, and an inertial measurement unit (IMU) was developed to enable UAVs to fly both indoors and outdoors. To calibrate the camera and the gimbaled LIDAR, a new method is proposed that uses a simple visual marker. The camera and the gimbaled LIDAR work in a complementary manner: feature points extracted from the camera images are merged with the LIDAR range measurements for state estimation. The features are processed by an online Extended Kalman Filter-Simultaneous Localization and Mapping (EKF-SLAM) algorithm to estimate the navigational states of the vehicle. These sensors and the EKF-SLAM algorithm were implemented and tested on an octo-rotor UAV platform. The results show that the navigation module can provide a real-time 3-D navigation solution without prior information about the surroundings. Second, real-time 3-D indoor navigation and closed-loop control of a quad-rotor aerial vehicle equipped with an IMU and a low-cost LIDAR are presented. To estimate the pose of the vehicle, an octree-based grid map and Monte Carlo Localization (MCL) are adopted. The navigation results obtained with the MCL are then evaluated against a motion capture system.
Finally, the results are ...
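The second contribution in the abstract localizes the vehicle by Monte Carlo Localization against a prior map. As a rough illustration only, below is a minimal 2-D particle-filter sketch of the predict-weight-resample cycle; the thesis itself uses a 3-D octree-based map and a scanning LIDAR, and the grid map, `raycast` helper, and all noise parameters here are hypothetical stand-ins.

```python
import math
import random

# Hypothetical 2-D occupancy grid ('#' = occupied, '.' = free).
GRID = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "#####",
]

def occupied(x, y):
    xi, yi = int(x), int(y)
    if 0 <= yi < len(GRID) and 0 <= xi < len(GRID[0]):
        return GRID[yi][xi] == "#"
    return True  # treat outside the map as occupied

def raycast(x, y, theta, max_range=5.0, step=0.05):
    """Simulated range: distance along theta to the nearest occupied cell."""
    r = 0.0
    while r < max_range:
        if occupied(x + r * math.cos(theta), y + r * math.sin(theta)):
            return r
        r += step
    return max_range

def mcl_step(particles, control, measurement, sigma=0.3):
    """One MCL iteration: motion update, measurement weighting, resampling."""
    dx, dy = control
    moved = [(x + dx + random.gauss(0, 0.05),
              y + dy + random.gauss(0, 0.05), th)
             for (x, y, th) in particles]
    # Weight each particle by how well its simulated range matches the sensor.
    weights = [math.exp(-((raycast(x, y, th) - measurement) ** 2)
                        / (2 * sigma ** 2)) + 1e-12
               for (x, y, th) in moved]
    # Resample with replacement, proportionally to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

if __name__ == "__main__":
    random.seed(0)
    # Particles spread over the free space, all facing the +x direction.
    particles = [(random.uniform(1, 4), random.uniform(1, 4), 0.0)
                 for _ in range(200)]
    true_pose = (1.5, 1.5, 0.0)
    z = raycast(*true_pose)  # simulated LIDAR range from the true pose
    for _ in range(10):
        particles = mcl_step(particles, (0.0, 0.0), z)
    mean_x = sum(p[0] for p in particles) / len(particles)
    print(f"estimated x = {mean_x:.2f}")
```

The real system replaces the toy `raycast` with ray queries into the octree map and the single scalar range with a full LIDAR scan, but the cycle of motion prediction, likelihood weighting, and resampling is the same.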
dc.language: eng
dc.publisher: Korea Advanced Institute of Science and Technology (KAIST)
dc.subject: Autonomous Navigation System
dc.subject: Unmanned Aerial Vehicles
dc.subject: Monte Carlo Localization
dc.subject: EKF-SLAM
dc.subject: Vision-LIDAR Sensor Fusion
dc.subject: Indoor 3-D Navigation
dc.title: Vision and LIDAR-based autonomous navigation system for indoor and outdoor flight of unmanned aerial vehicles
dc.title.alternative: A study on a vision and LIDAR-based autonomous navigation system for indoor and outdoor flight of unmanned aerial vehicles
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 591870/325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST), Department of Aerospace Engineering
dc.identifier.uid: 020095180
dc.contributor.localauthor: Shim, Hyun-Chul
dc.contributor.localauthor: 심현철
Appears in Collection:
EE-Theses_Ph.D. (doctoral theses)

Files in This Item:
There are no files associated with this item.
