Feature matching in omni-directional images for mobile robot localization and map construction (이동로봇의 위치 추정과 지도 생성을 위한 전방향 영상에서의 특징점 대응)

DC Field | Value | Language
dc.contributor.advisor | 정명진 | -
dc.contributor.advisor | Chung, Myung-Jin | -
dc.contributor.author | 이영진 | -
dc.contributor.author | Lee, Young-Jin | -
dc.date.accessioned | 2011-12-14 | -
dc.date.available | 2011-12-14 | -
dc.date.issued | 2003 | -
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=231133&flag=dissertation | -
dc.identifier.uri | http://hdl.handle.net/10203/35183 | -
dc.description | 학위논문(박사) - 한국과학기술원 : 전기및전자공학전공, 2003.8, [xi, 131 p.] (Ph.D. thesis, KAIST, Division of Electrical and Electronics Engineering) | -
dc.description.abstract | This dissertation proposes solutions to three problems involved in a mobile robot equipped with an omnidirectional vision sensor. The first is the matching problem of finding correspondences between features in omnidirectional images. Conventional methods based on feature tracking have limitations when the sensor motion becomes large. To produce reliable matching results even under large translation and rotation of the sensor, we propose a method that combines the advantages of the Sum of Squared Differences (SSD) and Dynamic Time Warping (DTW). Dominant corresponding feature pairs are found using a proximity matrix and an SSD-based similarity matrix, and the remaining feature matching is then accomplished by DTW. Distortions due to the conic mirror are handled well by DTW, and the ‘initial point constraint’ for DTW is imposed by SSD even with large sensor motion. Experimental results show that a zero failure rate of matching can be achieved in an indoor environment even with translational sensor motion larger than 10 cm and any amount of sensor rotation. When a feature is identified at more than two sensor locations, the 2D position of the feature can be estimated by triangulation. Experimental results of map building are given to demonstrate the validity of the proposed feature matching method. The second problem we deal with is the absolute localization of a mobile robot. We devised a new linear method that finds the position and orientation of a robot using only bearing measurements of landmarks. We also propose a method for finding correspondences between features in a 2D map and features in an image by integrating the proposed localization algorithm with an ‘interpretation tree search’ algorithm. The primary advantage of the proposed method is that localization is accomplished simultaneously in the matching phase. The localization algorithm and the feature matching method are presented and simulation results are add | kor
dc.language | kor | -
dc.publisher | 한국과학기술원 (KAIST) | -
dc.subject | 위치 추정 (localization) | -
dc.subject | 특징점 대응 (feature matching) | -
dc.subject | 이동 로봇 (mobile robot) | -
dc.subject | 전방향 시각 센서 (omnidirectional vision sensor) | -
dc.subject | 지도 생성 (map building) | -
dc.subject | map building | -
dc.subject | localization | -
dc.subject | feature matching | -
dc.subject | mobile robot | -
dc.subject | omnidirectional vision sensor | -
dc.title | 이동로봇의 위치 추정과 지도 생성을 위한 전방향 영상에서의 특징점 대응 | -
dc.title.alternative | Feature matching in omni-directional images for mobile robot localization and map construction | -
dc.type | Thesis(Ph.D) | -
dc.identifier.CNRN | 231133/325007 | -
dc.description.department | 한국과학기술원 : 전기및전자공학전공 (KAIST, Division of Electrical and Electronics Engineering) | -
dc.identifier.uid | 000965292 | -
dc.contributor.localauthor | 정명진 | -
dc.contributor.localauthor | Chung, Myung-Jin | -
Appears in Collection: EE-Theses_Ph.D.(박사논문)
Files in This Item: There are no files associated with this item.
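
The abstract above describes the matching pipeline only at a high level: an SSD-based similarity matrix (together with a proximity matrix) fixes dominant corresponding pairs, and DTW, started from that ‘initial point constraint’, matches the remaining features. Since the full text is not attached to this record, the following is only a minimal sketch of that idea under stated assumptions, not the author's implementation: features are assumed to be fixed-length descriptor vectors ordered by azimuth around the omnidirectional image, a single globally best SSD pair serves as the DTW anchor, and the proximity matrix is omitted. All function names are illustrative.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences between two feature descriptors."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return float(np.dot(d, d))

def dominant_pair(desc_a, desc_b):
    """Globally lowest-SSD feature pair, used here as the DTW anchor."""
    cost = np.array([[ssd(a, b) for b in desc_b] for a in desc_a])
    i = int(cost.min(axis=1).argmin())   # row whose best match is globally best
    j = int(cost[i].argmin())
    return i, j, cost

def dtw_match(cost):
    """Monotonic DTW alignment over a pairwise cost matrix.
    Returns matched (i, j) index pairs along the optimal warping path."""
    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1])
    path, i, j = [], n, m                 # backtrack from the end of both sequences
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def match_features(desc_a, desc_b):
    """SSD fixes the anchor pair; both azimuth-ordered (circular) feature
    sequences are cut at the anchor, and DTW aligns the remaining features."""
    i0, j0, cost = dominant_pair(desc_a, desc_b)
    order_a = np.roll(np.arange(len(desc_a)), -i0)
    order_b = np.roll(np.arange(len(desc_b)), -j0)
    rolled = cost[np.ix_(order_a, order_b)]
    return [(int(order_a[i]), int(order_b[j])) for i, j in dtw_match(rolled)]
```

Cutting both circular sequences at a common anchor is what lets ordinary (non-circular) DTW tolerate arbitrary sensor rotation, which is the role the abstract assigns to the SSD step.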
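The abstract also notes that a feature observed from multiple sensor locations can be placed on the 2D map by triangulation. Below is a minimal least-squares ray-intersection sketch for the two-view case, assuming the sensor positions and the absolute bearings to the feature are known; the function name and signature are illustrative.

```python
import numpy as np

def triangulate(p1, theta1, p2, theta2):
    """Estimate a feature's 2D position from two bearing observations.
    p1, p2         : known 2D sensor positions
    theta1, theta2 : absolute bearings (rad) from each position to the feature"""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters in a least-squares sense.
    A = np.column_stack([d1, -d2])
    t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return p1 + t[0] * d1
```

With more than two observations the same idea extends by stacking one ray equation per view and solving the resulting overdetermined system.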
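For the absolute localization problem, the abstract states that a new linear method recovers the robot's position and orientation from bearing measurements of landmarks, but the formulation itself is not reproduced in this record. As a stand-in, the sketch below uses the classical circle-intersection (three-point resection) construction, which also reduces position estimation to a linear solve once the radical axes of the bearing-difference circles are formed; it illustrates the kind of computation involved and should not be read as the thesis's algorithm.

```python
import numpy as np

def circle_from_bearings(Li, Lj, alpha):
    """Circle of positions from which the chord Li->Lj is seen under the
    signed (CCW) angle alpha, by the inscribed-angle theorem."""
    Li, Lj = np.asarray(Li, float), np.asarray(Lj, float)
    mid, chord = 0.5 * (Li + Lj), Lj - Li
    perp = np.array([-chord[1], chord[0]])            # chord rotated by +90 deg
    center = mid + perp / (2.0 * np.tan(alpha))
    radius = np.linalg.norm(chord) / (2.0 * abs(np.sin(alpha)))
    return center, radius

def localize(landmarks, bearings):
    """Robot position and heading from bearings to three known landmarks.
    Assumes non-degenerate geometry (no bearing difference equal to 0 or pi,
    robot not concyclic or collinear with the landmarks used)."""
    L1, L2, L3 = (np.asarray(p, float) for p in landmarks)
    b1, b2, b3 = bearings
    c12, r12 = circle_from_bearings(L1, L2, b2 - b1)
    c23, r23 = circle_from_bearings(L2, L3, b3 - b2)
    c13, r13 = circle_from_bearings(L1, L3, b3 - b1)

    def radical_axis(ca, ra, cb, rb):
        # Subtracting the two circle equations gives a line: a . P = d
        return 2.0 * (cb - ca), (cb @ cb - rb ** 2) - (ca @ ca - ra ** 2)

    a1, d1 = radical_axis(c12, r12, c23, r23)   # line through L2 and the robot
    a2, d2 = radical_axis(c12, r12, c13, r13)   # line through L1 and the robot
    pos = np.linalg.solve(np.vstack([a1, a2]), np.array([d1, d2]))
    heading = np.arctan2(L1[1] - pos[1], L1[0] - pos[0]) - b1
    return pos, (heading + np.pi) % (2.0 * np.pi) - np.pi
```

Once the position is fixed, the heading follows from any single bearing, which mirrors the abstract's point that the full pose can be obtained from bearing measurements alone.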
