DynaVINS: A Visual-Inertial SLAM for Dynamic Environments

Cited 28 times in Web of Science · Cited 0 times in Scopus
  • Hit: 209
  • Download: 0
DC Field | Value | Language
dc.contributor.author | Song, Seung Won | ko
dc.contributor.author | Lim, HyungTae | ko
dc.contributor.author | Lee, Junho | ko
dc.contributor.author | Myung, Hyun | ko
dc.date.accessioned | 2022-09-19T02:00:14Z | -
dc.date.available | 2022-09-19T02:00:14Z | -
dc.date.created | 2022-09-13 | -
dc.date.issued | 2022-10 | -
dc.identifier.citation | IEEE ROBOTICS AND AUTOMATION LETTERS, v.7, no.4, pp.11523 - 11530 | -
dc.identifier.issn | 2377-3766 | -
dc.identifier.uri | http://hdl.handle.net/10203/298574 | -
dc.description.abstract | Visual-inertial odometry and SLAM algorithms are widely used in various fields, such as service robots, drones, and autonomous vehicles. Most SLAM algorithms are based on the assumption that landmarks are static. However, in the real world, various dynamic objects exist, and they degrade the pose estimation accuracy. In addition, temporarily static objects, which are static during observation but move when they are out of sight, trigger false positive loop closings. To overcome these problems, we propose a novel visual-inertial SLAM framework, called DynaVINS, which is robust against both dynamic objects and temporarily static objects. In our framework, we first present a robust bundle adjustment that can reject features from dynamic objects by leveraging pose priors estimated by IMU preintegration (a toy sketch of this weighting idea is given after the metadata fields below). Then, keyframe grouping and multi-hypothesis-based constraint grouping methods are proposed to reduce the effect of temporarily static objects in loop closing. Subsequently, we evaluated our method on a public dataset that contains numerous dynamic objects. Finally, the experimental results corroborate that DynaVINS achieves promising performance compared with other state-of-the-art methods by successfully rejecting the effect of dynamic and temporarily static objects. | -
dc.language | English | -
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | -
dc.title | DynaVINS: A Visual-Inertial SLAM for Dynamic Environments | -
dc.type | Article | -
dc.identifier.wosid | 000850870500005 | -
dc.identifier.scopusid | 2-s2.0-85137583753 | -
dc.type.rims | ART | -
dc.citation.volume | 7 | -
dc.citation.issue | 4 | -
dc.citation.beginningpage | 11523 | -
dc.citation.endingpage | 11530 | -
dc.citation.publicationname | IEEE ROBOTICS AND AUTOMATION LETTERS | -
dc.identifier.doi | 10.1109/LRA.2022.3203231 | -
dc.contributor.localauthor | Myung, Hyun | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Visual-inertial SLAM | -
dc.subject.keywordAuthor | SLAM | -
dc.subject.keywordAuthor | visual tracking | -
dc.subject.keywordPlus | ROBUST | -
dc.subject.keywordPlus | TRACKING | -
dc.subject.keywordPlus | VERSATILE | -
dc.subject.keywordPlus | ODOMETRY | -
dc.subject.keywordPlus | FILTER | -
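
The abstract above mentions a robust bundle adjustment that rejects features from dynamic objects by leveraging pose priors from IMU preintegration. Below is a minimal sketch of that general idea only, not the paper's actual formulation: it assumes a toy 1-D state, a per-feature weight cost of the form w*r^2 + lam*(1 - w)^2, and illustrative names (robust_weights, weighted_pose_refinement, lam) that do not come from DynaVINS.

```python
import numpy as np

# Illustrative sketch only: down-weight feature residuals that disagree with an
# IMU-derived pose prior. The cost sum_j w_j*r_j^2 + lam*(1 - w_j)^2 and all
# names here are assumptions for this toy example, not the DynaVINS formulation.

def robust_weights(residuals, lam):
    """Closed-form minimizer of w*r^2 + lam*(1 - w)^2 over w, clipped to [0, 1].
    Large residuals (e.g., features on moving objects) get weights near zero."""
    w = 1.0 - np.square(residuals) / (2.0 * lam)
    return np.clip(w, 0.0, 1.0)

def weighted_pose_refinement(pose_prior, features, lam=0.5, iters=10):
    """Alternate between re-estimating a toy 1-D 'pose' from weighted feature
    measurements and re-computing the weights, starting from (and anchored to)
    the IMU-preintegration pose prior."""
    pose = pose_prior
    for _ in range(iters):
        r = features - pose                      # toy reprojection residuals
        w = robust_weights(r, lam)               # down-weight inconsistent features
        # weighted least-squares update, with the prior acting as an extra anchor
        pose = (w @ features + lam * pose_prior) / (w.sum() + lam)
    return pose, w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    static_feats = rng.normal(2.0, 0.05, size=40)   # consistent with the true pose
    dynamic_feats = rng.normal(5.0, 0.05, size=10)  # features on a moving object
    feats = np.concatenate([static_feats, dynamic_feats])

    pose, w = weighted_pose_refinement(pose_prior=1.8, features=feats)
    print(f"estimated pose: {pose:.3f} (static features cluster around 2.0)")
    print(f"mean weight: static {w[:40].mean():.2f}, dynamic {w[40:].mean():.2f}")
```

In this toy setting, the alternating closed-form weight update stands in for the robust loss a full bundle adjustment would use; in a real visual-inertial pipeline the residuals would be reprojection errors and the anchor would come from IMU preintegration factors.
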
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.