Real-Time Visual SLAM for Autonomous Underwater Hull Inspection Using Visual Saliency

Cited 76 times in Web of Science; cited 0 times in Scopus
This paper reports a real-time monocular visual simultaneous localization and mapping (SLAM) algorithm and results for its application in the area of autonomous underwater ship hull inspection. The proposed algorithm overcomes some of the specific challenges associated with underwater visual SLAM, namely, limited-field-of-view imagery and feature-poor regions. It does so by exploiting the SLAM navigation prior within the image registration pipeline and by being selective about which imagery is considered informative for the visual SLAM map. Novel online bag-of-words measures for intra- and inter-image saliency are introduced and shown to be useful for image key-frame selection, information-gain-based link hypothesis, and novelty detection. Results from three real-world hull inspection experiments evaluate the overall approach, including one survey comprising a 3.4-h, 2.7-km-long trajectory.
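The intra-image saliency idea described above — scoring how informative a single image is from the distribution of its visual words — can be illustrated with a normalized bag-of-words entropy. This is a minimal sketch of one plausible formulation, not the paper's actual code: the function name, vocabulary size, and exact normalization are assumptions.

```python
import math
from collections import Counter

def intra_image_saliency(word_ids, vocab_size):
    """Normalized entropy of an image's bag-of-words histogram.

    word_ids: visual-word index assigned to each detected feature
    vocab_size: size of the visual vocabulary (assumed parameter)

    Returns a score in [0, 1]: near 0 for feature-poor imagery whose
    descriptors collapse onto a few visual words, higher for
    texture-rich imagery whose words spread across the vocabulary.
    """
    if not word_ids:
        return 0.0
    n = len(word_ids)
    counts = Counter(word_ids)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return entropy / math.log2(vocab_size)  # normalize by max possible entropy

# A repetitive, feature-poor image scores low; a varied one scores higher.
flat = [3] * 50            # every feature maps to the same word
rich = list(range(50))     # features spread over 50 distinct words
print(intra_image_saliency(flat, 1024))   # 0.0
print(intra_image_saliency(rich, 1024))   # ~0.56
```

A key-frame selector could then keep only images whose saliency exceeds a threshold, which matches the abstract's notion of discarding imagery that is uninformative for the map.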
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2013-06
Language
English
Article Type
Article
Keywords

BAG-OF-WORDS; SIMULTANEOUS LOCALIZATION; IMAGE CLASSIFICATION; NAVIGATION; APPEARANCE; FEATURES; ROBOT; PERCEPTION; ELEMENTS; TEXTONS

Citation

IEEE TRANSACTIONS ON ROBOTICS, v.29, no.3, pp. 719-733

ISSN
1552-3098
DOI
10.1109/TRO.2012.2235699
URI
http://hdl.handle.net/10203/192392
Appears in Collection
CE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
