Real-time humanoid whole-body remote control framework for imitating human motion based on kinematic mapping and motion constraints

Cited 11 times in Web of Science · Cited 9 times in Scopus
DC Field | Value | Language
dc.contributor.author | Oh, Jaesung | ko
dc.contributor.author | Lee, In-Ho | ko
dc.contributor.author | Jeong, Hyobin | ko
dc.contributor.author | Oh, Jun-Ho | ko
dc.date.accessioned | 2019-04-24T13:14:00Z | -
dc.date.available | 2019-04-24T13:14:00Z | -
dc.date.created | 2019-04-22 | -
dc.date.issued | 2019-03 | -
dc.identifier.citation | ADVANCED ROBOTICS, v.33, no.6, pp.293 - 305 | -
dc.identifier.issn | 0169-1864 | -
dc.identifier.uri | http://hdl.handle.net/10203/261492 | -
dc.description.abstract | In this paper, we propose a whole-body remote control framework that enables a robot to imitate human motion efficiently. The framework is divided into kinematic mapping and quadratic programming based whole-body inverse kinematics. In the kinematic mapping, the human motion obtained through a data acquisition device is transformed into a reference motion that is suitable for the robot to follow. To address differences in the kinematic configuration and dynamic properties of the robot and human, quadratic programming is used to calculate the joint angles of the robot considering self-collision, joint limits, and dynamic stability. To address dynamic stability, we use constraints based on the divergent component of motion and zero moment point in the linear inverted pendulum model. Simulation using Choreonoid and a locomotion experiment using the HUBO2+ demonstrate the performance of the proposed framework. The proposed framework has the potential to reduce the preview time or offline task computation time found in previous approaches and hence improve the similarity of human and robot motion while maintaining stability. | -
dc.language | English | -
dc.publisher | TAYLOR & FRANCIS LTD | -
dc.title | Real-time humanoid whole-body remote control framework for imitating human motion based on kinematic mapping and motion constraints | -
dc.type | Article | -
dc.identifier.wosid | 000463855300003 | -
dc.identifier.scopusid | 2-s2.0-85062450852 | -
dc.type.rims | ART | -
dc.citation.volume | 33 | -
dc.citation.issue | 6 | -
dc.citation.beginningpage | 293 | -
dc.citation.endingpage | 305 | -
dc.citation.publicationname | ADVANCED ROBOTICS | -
dc.identifier.doi | 10.1080/01691864.2019.1581658 | -
dc.contributor.localauthor | Oh, Jun-Ho | -
dc.contributor.nonIdAuthor | Jeong, Hyobin | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Real-time human motion imitation | -
dc.subject.keywordAuthor | whole-body remote control | -
dc.subject.keywordAuthor | whole-body teleoperation | -
dc.subject.keywordAuthor | whole-body master-slave system | -
dc.subject.keywordAuthor | human-like motion generation | -
dc.subject.keywordPlus | ROBOTS | -
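The abstract describes posing whole-body inverse kinematics as a quadratic program with joint-limit constraints. The minimal sketch below illustrates that idea only: it is not the paper's implementation, and the 2-link planar arm, link lengths, joint limits, damping weight, and target point are all illustrative assumptions. One velocity-level IK step minimizes the task-space error of a linearized model subject to joint-limit bounds, which is the core structure of QP-based IK.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 2-link planar arm; link lengths are illustrative assumptions.
L1, L2 = 1.0, 0.8

def fk(q):
    """Forward kinematics: end-effector (x, y) for joint angles q."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    """Analytic Jacobian of fk with respect to q."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def qp_ik_step(q, x_ref, q_min, q_max, damping=1e-3):
    """One velocity-level IK step posed as a small QP:
       minimize ||J dq - (x_ref - fk(q))||^2 + damping * ||dq||^2
       subject to joint-limit bounds on q + dq."""
    J = jacobian(q)
    err = x_ref - fk(q)

    def cost(dq):
        r = J @ dq - err
        return r @ r + damping * (dq @ dq)

    # Bounds on dq so that q + dq stays inside the joint limits.
    bounds = [(lo - qi, hi - qi) for qi, lo, hi in zip(q, q_min, q_max)]
    res = minimize(cost, np.zeros(2), bounds=bounds)
    return q + res.x

# Track a reachable target while respecting joint limits.
q = np.array([0.3, 0.5])
q_min, q_max = np.array([-2.0, 0.0]), np.array([2.0, 2.5])
target = np.array([1.2, 0.8])
for _ in range(50):
    q = qp_ik_step(q, target, q_min, q_max)

print("tracking error:", np.linalg.norm(fk(q) - target))
```

In the paper's framework the decision variables are whole-body joint angles and the QP additionally carries self-collision and DCM/ZMP stability constraints; here only the joint-limit bounds are kept to show the structure of the problem.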
Appears in Collection
ME-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
This item is cited by other documents in WoS
