Robotic Mapping Approach under Illumination-Variant Environments at Planetary Construction Sites

Cited 5 times in Web of Science; cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Hong, Sungchul (ko)
dc.contributor.author: Shyam, Pranjay (ko)
dc.contributor.author: Bangunharcana, Antyanta (ko)
dc.contributor.author: Shin, Hyuseoung (ko)
dc.date.accessioned: 2022-04-15T06:42:19Z
dc.date.available: 2022-04-15T06:42:19Z
dc.date.created: 2022-03-21
dc.date.issued: 2022-02
dc.identifier.citation: REMOTE SENSING, v.14, no.4
dc.identifier.issn: 2072-4292
dc.identifier.uri: http://hdl.handle.net/10203/294749
dc.description.abstract: In planetary construction, the semiautonomous teleoperation of robots is expected to perform complex tasks for site preparation and infrastructure emplacement. A highly detailed 3D map is essential for construction planning and management. However, the planetary surface imposes mapping restrictions due to rugged and homogeneous terrains. Additionally, changes in illumination conditions cause the mapping result (or 3D point-cloud map) to have inconsistent color properties that hamper the understanding of the topographic properties of a worksite. Therefore, this paper proposes a robotic construction mapping approach robust to illumination-variant environments. The proposed approach leverages a deep learning-based low-light image enhancement (LLIE) method to improve the mapping capabilities of the visual simultaneous localization and mapping (SLAM)-based robotic mapping method. In the experiment, the robotic mapping system in the emulated planetary worksite collected terrain images during the daytime from noon to late afternoon. Two sets of point-cloud maps, which were created from original and enhanced terrain images, were examined for comparison purposes. The experiment results showed that the LLIE method in the robotic mapping method significantly enhanced the brightness, preserving the inherent colors of the original terrain images. The visibility and the overall accuracy of the point-cloud map were consequently increased.
dc.language: English
dc.publisher: MDPI
dc.title: Robotic Mapping Approach under Illumination-Variant Environments at Planetary Construction Sites
dc.type: Article
dc.identifier.wosid: 000767171400001
dc.identifier.scopusid: 2-s2.0-85125006896
dc.type.rims: ART
dc.citation.volume: 14
dc.citation.issue: 4
dc.citation.publicationname: REMOTE SENSING
dc.identifier.doi: 10.3390/rs14041027
dc.contributor.nonIdAuthor: Hong, Sungchul
dc.contributor.nonIdAuthor: Shin, Hyuseoung
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: planetary construction
dc.subject.keywordAuthor: robotic mapping
dc.subject.keywordAuthor: SLAM
dc.subject.keywordAuthor: low-light enhancement
dc.subject.keywordAuthor: 3D point-cloud map
dc.subject.keywordAuthor: deep learning
dc.subject.keywordPlus: IMAGE-ENHANCEMENT
dc.subject.keywordPlus: WATER ICE
dc.subject.keywordPlus: SLAM
dc.subject.keywordPlus: MOON
dc.subject.keywordPlus: LOCALIZATION
dc.subject.keywordPlus: BRIGHTNESS
dc.subject.keywordPlus: RETINEX
dc.subject.keywordPlus: STEREO
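
The approach summarized in the abstract above pairs a deep learning-based low-light image enhancement (LLIE) step with visual SLAM. The Python sketch below illustrates that idea only in outline and is not the authors' implementation: a classical CLAHE adjustment stands in for the learned LLIE network, and track_frame, run_mapping, and the image file names are hypothetical placeholders for a real SLAM front end and terrain image sequence.

# Minimal, hypothetical sketch of "enhance low-light images, then map":
# CLAHE stands in for the deep learning-based LLIE model; the SLAM front
# end is a stub.
import cv2
import numpy as np


def enhance_low_light(bgr_image: np.ndarray) -> np.ndarray:
    """Brighten a dark terrain image while leaving its colors untouched.

    Stand-in for a learned LLIE network: contrast-limited adaptive
    histogram equalization (CLAHE) applied to the lightness channel only,
    so the chrominance channels are preserved.
    """
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    l_channel, a_channel, b_channel = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    l_enhanced = clahe.apply(l_channel)
    lab_enhanced = cv2.merge((l_enhanced, a_channel, b_channel))
    return cv2.cvtColor(lab_enhanced, cv2.COLOR_LAB2BGR)


def track_frame(frame: np.ndarray) -> None:
    """Placeholder for the visual SLAM front end (hypothetical)."""
    # A real system would extract and match features here, then update the
    # camera pose and the 3D point-cloud map.
    pass


def run_mapping(image_paths: list[str]) -> None:
    """Feed illumination-enhanced terrain images into the SLAM stub."""
    for path in image_paths:
        frame = cv2.imread(path)
        if frame is None:
            continue  # skip unreadable files
        track_frame(enhance_low_light(frame))


if __name__ == "__main__":
    # Hypothetical file names; replace with an actual terrain image sequence.
    run_mapping(["terrain_000.png", "terrain_001.png"])

Operating on the lightness channel alone keeps the chrominance unchanged, which mirrors the abstract's point that enhancement should brighten dark terrain images while preserving their inherent colors.
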
Files in This Item
There are no files associated with this item.