Camera-laser fusion sensor system and environmental recognition for humanoids in disaster scenarios

Cited 6 times in Web of Science · Cited 0 times in Scopus
  • Hits: 478
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Lee, Inho | ko
dc.contributor.author | Oh, Jaesung | ko
dc.contributor.author | Kim, Inhyeok | ko
dc.contributor.author | Oh, Jun-Ho | ko
dc.date.accessioned | 2017-10-23T02:39:34Z | -
dc.date.available | 2017-10-23T02:39:34Z | -
dc.date.created | 2017-10-16 | -
dc.date.issued | 2017-06 | -
dc.identifier.citation | JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, v.31, no.6, pp.2997 - 3003 | -
dc.identifier.issn | 1738-494X | -
dc.identifier.uri | http://hdl.handle.net/10203/226646 | -
dc.description.abstract | This research aims to develop a vision sensor system and a recognition algorithm that enable a humanoid to operate autonomously in a disaster environment. In disaster response scenarios, humanoid robots that perform manipulation and locomotion tasks must identify the objects in the environment specified by the challenge of the United States' Defense Advanced Research Projects Agency (DARPA), e.g., doors, valves, drills, debris, uneven terrain, and stairs, among others. To enable a humanoid to undertake these tasks, we construct a camera-laser fusion system and develop an environmental recognition algorithm. A laser distance sensor and a motor are used to obtain 3D point cloud data. We project the 3D point cloud onto the 2D image according to the intrinsic parameters of the camera and the distortion model of the lens. In this manner, our fusion sensor system performs functions comparable to those of the RGB-D sensors generally used in segmentation research. Our recognition algorithm is based on super-pixel segmentation and random sampling. The proposed approach clusters the unorganized point cloud according to geometric characteristics, namely proximity and co-planarity. To assess the feasibility of our system and algorithm, we use the humanoid robot DRC-HUBO; the results are demonstrated in the accompanying video. | -
dc.language | English | -
dc.publisher | KOREAN SOC MECHANICAL ENGINEERS | -
dc.subject | IMAGE SEGMENTATION | -
dc.title | Camera-laser fusion sensor system and environmental recognition for humanoids in disaster scenarios | -
dc.type | Article | -
dc.identifier.wosid | 000411707900002 | -
dc.identifier.scopusid | 2-s2.0-85025692480 | -
dc.type.rims | ART | -
dc.citation.volume | 31 | -
dc.citation.issue | 6 | -
dc.citation.beginningpage | 2997 | -
dc.citation.endingpage | 3003 | -
dc.citation.publicationname | JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY | -
dc.identifier.doi | 10.1007/s12206-017-0543-0 | -
dc.contributor.localauthor | Oh, Jun-Ho | -
dc.contributor.nonIdAuthor | Lee, Inho | -
dc.contributor.nonIdAuthor | Kim, Inhyeok | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Humanoid vision system | -
dc.subject.keywordAuthor | Camera-laser fusion | -
dc.subject.keywordAuthor | Disaster response scenario | -
dc.subject.keywordAuthor | Environmental recognition | -
dc.subject.keywordPlus | IMAGE SEGMENTATION | -
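
The abstract above describes projecting the 3D laser point cloud onto the 2D camera image using the camera's intrinsic parameters and a lens distortion model. The sketch below illustrates only that projection step, using a standard pinhole model with two-term radial distortion; the calibration values (K, k1, k2, R, t) are hypothetical placeholders, not the paper's calibration, and this is not the authors' implementation.

```python
import numpy as np

# Illustrative pinhole projection of laser points into the camera image.
# All calibration values below are hypothetical placeholders, not the
# calibration reported in the paper.
K = np.array([[525.0,   0.0, 319.5],
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])   # intrinsics (fx, fy, cx, cy)
k1, k2 = -0.15, 0.05                    # two-term radial distortion

# Extrinsics: rotation and translation mapping laser-frame points into
# the camera frame (also placeholders).
R = np.eye(3)
t = np.array([0.05, 0.0, -0.10])


def project_laser_points(points_laser, image_size=(640, 480)):
    """Project an Nx3 array of laser points (meters, laser frame) to pixels.

    Returns an Mx2 integer array of (u, v) coordinates for the points that
    end up in front of the camera and inside the image bounds.
    """
    # Transform into the camera frame and discard points behind the camera.
    p_cam = points_laser @ R.T + t
    p_cam = p_cam[p_cam[:, 2] > 1e-6]

    # Normalized image coordinates.
    x = p_cam[:, 0] / p_cam[:, 2]
    y = p_cam[:, 1] / p_cam[:, 2]

    # Apply the radial distortion model.
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    xd, yd = x * scale, y * scale

    # Map to pixel coordinates with the intrinsic matrix.
    u = K[0, 0] * xd + K[0, 2]
    v = K[1, 1] * yd + K[1, 2]

    w, h = image_size
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return np.stack([u[inside], v[inside]], axis=1).astype(int)


if __name__ == "__main__":
    # Synthetic cloud in front of the camera, 1-6 m away.
    cloud = np.random.uniform([-2.0, -1.0, 1.0], [2.0, 1.0, 6.0], (1000, 3))
    pixels = project_laser_points(cloud)
    print(f"{len(pixels)} of {len(cloud)} points project into the image")
```

In practice the same mapping is available through OpenCV's cv2.projectPoints, which additionally handles tangential distortion terms; the colored pixels obtained this way give the RGB-D-like data the abstract refers to, which the segmentation step can then consume.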
Appears in Collection
ME-Journal Papers(저널논문)
Files in This Item
There are no files associated with this item.
This item is cited by other documents in WoS
