다중센서 융합 상이 지도를 통한 다중센서 기반 3차원 복원 결과 개선
Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map

This paper describes an algorithm that improves a 3D reconstruction result using a multi-sensor fusion disparity map. We can project LRF (Laser Range Finder) 3D points onto image pixel coordinates using the extrinsic calibration matrices of the camera-LRF pair (Φ, Δ) and the camera calibration matrix (K). The LRF disparity map can be generated by interpolating the projected LRF points. In the stereo reconstruction, we can compensate invalid points caused by repeated patterns and textureless regions using the LRF disparity map. The resulting disparity map of this compensation process is the multi-sensor fusion disparity map. We can refine the multi-sensor 3D reconstruction, based on stereo vision and LRF, using the multi-sensor fusion disparity map. The refinement algorithm of the multi-sensor based 3D reconstruction is specified in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested on synchronized stereo image pairs and LRF 3D scan data.
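The core steps of the abstract — projecting LRF points through the camera model, converting their depths to disparities, and filling invalid stereo disparities from the LRF disparity map — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the zero-valued invalid-disparity convention are assumptions, Φ/Δ are taken as a rotation matrix and translation vector, and the interpolation step that densifies the sparse LRF projections is omitted.

```python
import numpy as np

def project_lrf_points(points_lrf, Phi, Delta, K):
    """Project LRF 3D points into image pixel coordinates.

    points_lrf : (N, 3) points in the LRF frame
    Phi        : (3, 3) rotation from the LRF frame to the camera frame
    Delta      : (3,)   translation from the LRF frame to the camera frame
    K          : (3, 3) camera intrinsic (calibration) matrix
    """
    # Transform into the camera frame, then apply the pinhole model.
    points_cam = points_lrf @ Phi.T + Delta      # (N, 3) in camera frame
    uvw = points_cam @ K.T                       # homogeneous pixel coords
    pixels = uvw[:, :2] / uvw[:, 2:3]            # perspective divide
    depths = points_cam[:, 2]                    # depth along the optical axis
    return pixels, depths

def depth_to_disparity(depths, focal_px, baseline_m):
    """Convert camera-frame depths to stereo disparities: d = f * B / Z."""
    return focal_px * baseline_m / depths

def fuse_disparity(stereo_disp, lrf_disp, invalid_value=0.0):
    """Fill invalid stereo disparities (e.g. from repeated patterns or
    textureless regions) with values from the LRF disparity map."""
    fused = stereo_disp.copy()
    mask = stereo_disp == invalid_value
    fused[mask] = lrf_disp[mask]
    return fused
```

In this sketch, a point on the optical axis at 2 m with a 500 px focal length and 0.1 m baseline projects to the principal point and yields a disparity of 25 px; in the fusion step, only pixels marked invalid in the stereo map are overwritten by the LRF map.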
Publisher
Korea Robotics Society (한국로봇학회)
Issue Date
2009-12
Language
Korean
Citation

The Journal of Korea Robotics Society (로봇학회 논문지), v.4, no.4, pp. 298-304

ISSN
1975-6291
URI
http://hdl.handle.net/10203/96914
Appears in Collection
EE-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
