Online underwater optical mapping for trajectories with gaps

Cited 3 times in Web of Science; cited 0 times in Scopus
DC Field | Value | Language
dc.contributor.author | Elibol, Armagan | ko
dc.contributor.author | Shim, Hyunjung | ko
dc.contributor.author | Hong, Seonghun | ko
dc.contributor.author | Kim, Jinwhan | ko
dc.contributor.author | Gracias, Nuno | ko
dc.contributor.author | Garcia, Rafael | ko
dc.date.accessioned | 2016-09-06T07:15:44Z | -
dc.date.available | 2016-09-06T07:15:44Z | -
dc.date.created | 2016-07-26 | -
dc.date.issued | 2016-07 | -
dc.identifier.citation | INTELLIGENT SERVICE ROBOTICS, v.9, no.3, pp.217 - 229 | -
dc.identifier.issn | 1861-2776 | -
dc.identifier.uri | http://hdl.handle.net/10203/212245 | -
dc.description.abstract | This paper proposes a vision-only online mosaicing method for underwater surveys. Our method tackles a common problem in low-cost imaging platforms, where complementary navigation sensors produce imprecise or even missing measurements. Under these circumstances, the success of the optical mapping depends on the continuity of the acquired video stream. However, this continuity cannot always be guaranteed due to motion blur or lack of texture, both common in underwater scenarios. Such temporal gaps hinder the extraction of reliable motion estimates from visual odometry and compromise the ability to infer the presence of loops for producing an adequate optical map. Unlike traditional underwater mosaicing methods, our proposal can handle camera trajectories with gaps between time-consecutive images. This is achieved by constructing a minimum spanning tree, which verifies whether the current topology is connected. To do so, we embed a trajectory-estimate correction step based on graph-theory algorithms. The proposed method was tested with several different underwater image sequences, and results are presented to illustrate its performance. | -
dc.language | English | -
dc.publisher | SPRINGER HEIDELBERG | -
dc.title | Online underwater optical mapping for trajectories with gaps | -
dc.type | Article | -
dc.identifier.wosid | 000378840600004 | -
dc.identifier.scopusid | 2-s2.0-84961644886 | -
dc.type.rims | ART | -
dc.citation.volume | 9 | -
dc.citation.issue | 3 | -
dc.citation.beginningpage | 217 | -
dc.citation.endingpage | 229 | -
dc.citation.publicationname | INTELLIGENT SERVICE ROBOTICS | -
dc.identifier.doi | 10.1007/s11370-016-0195-4 | -
dc.contributor.localauthor | Shim, Hyunjung | -
dc.contributor.localauthor | Kim, Jinwhan | -
dc.contributor.nonIdAuthor | Elibol, Armagan | -
dc.contributor.nonIdAuthor | Gracias, Nuno | -
dc.contributor.nonIdAuthor | Garcia, Rafael | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Underwater robotics | -
dc.subject.keywordAuthor | Optical mapping | -
dc.subject.keywordAuthor | Image mosaicing | -
dc.subject.keywordAuthor | Environmental monitoring | -
dc.subject.keywordPlus | KALMAN FILTER | -
dc.subject.keywordPlus | NAVIGATION | -
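The abstract's connectivity check — treating each image as a graph node and each successful pairwise registration as an edge, then verifying that the topology forms one connected component — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `matches` list and union-find approach are assumptions, chosen because union-find answers the same connectivity question that building a minimum spanning tree would.

```python
def find(parent, i):
    """Find the root of node i, with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

def is_connected(num_images, matches):
    """Return True if pairwise matches link all images into one component."""
    parent = list(range(num_images))
    for a, b in matches:
        ra, rb = find(parent, a), find(parent, b)
        if ra != rb:
            parent[rb] = ra  # union the two components
    roots = {find(parent, i) for i in range(num_images)}
    return len(roots) == 1

# Example: images 0-3 with a temporal gap between images 1 and 2;
# a loop-closure match (0, 3) restores connectivity.
print(is_connected(4, [(0, 1), (2, 3)]))          # → False (two components)
print(is_connected(4, [(0, 1), (2, 3), (0, 3)]))  # → True (one component)
```

When the check reports a disconnected topology, the method described in the abstract would trigger its trajectory-estimate correction step to search for the missing cross-over matches.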
Appears in Collections
AI - Journal Papers
ME - Journal Papers
Files in This Item
There are no files associated with this item.
