Design and Development of an Emotional Interaction Robot, Mung

Cited 6 times in Web of Science; cited 0 times in Scopus
DC Field / Value / Language
dc.contributor.author: Kim, EH (ko)
dc.contributor.author: Kwak, SS (ko)
dc.contributor.author: Hyun, KH (ko)
dc.contributor.author: Kim, Soohyun (ko)
dc.contributor.author: Kwak, YK (ko)
dc.date.accessioned: 2013-03-09T16:17:28Z
dc.date.available: 2013-03-09T16:17:28Z
dc.date.created: 2012-02-06
dc.date.issued: 2009
dc.identifier.citation: ADVANCED ROBOTICS, v.23, no.6, pp.767 - 784
dc.identifier.issn: 0169-1864
dc.identifier.uri: http://hdl.handle.net/10203/96835
dc.description.abstract: The purpose of this study is to develop an interactive and emotional robot that is designed with passive characteristics and a delicate interaction concept using the interaction response: 'Bruises and complexion color due to emotional stimuli'. In order to overcome the mismatch of cue realism issues that cause the Uncanny Valley, an emotional interaction robot, Mung, was developed with a simple design composed of a body and two eyes. Mung can recognize human emotions from human-robot or human-human verbal communications. The developed robot expresses its emotions according to its emotional state, which is modeled by a mass-spring-damper system with an elastic-hysteresis spring. The robot displays a bruise when it is in a negative emotional state, just as a human becomes bruised when physically hurt, and the robot shows a natural complexion when its emotional wound is removed. The effectiveness of the emotional expression using color, with the concepts of bruising and complexion colors, was qualified, and the feasibility of the developed robot was tested in several exhibitions and field trials. (c) Koninklijke Brill NV, Leiden and The Robotics Society of Japan, 2009
dc.language: English
dc.publisher: VSP BV
dc.subject: SPEECH
dc.subject: HRI
dc.title: Design and Development of an Emotional Interaction Robot, Mung
dc.type: Article
dc.identifier.wosid: 000265821200007
dc.identifier.scopusid: 2-s2.0-67649336998
dc.type.rims: ART
dc.citation.volume: 23
dc.citation.issue: 6
dc.citation.beginningpage: 767
dc.citation.endingpage: 784
dc.citation.publicationname: ADVANCED ROBOTICS
dc.identifier.doi: 10.1163/156855309X431712
dc.contributor.localauthor: Kim, Soohyun
dc.contributor.localauthor: Kwak, YK
dc.contributor.nonIdAuthor: Kim, EH
dc.contributor.nonIdAuthor: Kwak, SS
dc.contributor.nonIdAuthor: Hyun, KH
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Emotional interaction robot
dc.subject.keywordAuthor: intelligent robots
dc.subject.keywordAuthor: human-robot interaction
dc.subject.keywordAuthor: speech emotion recognition
dc.subject.keywordAuthor: emotional model
dc.subject.keywordPlus: SPEECH
dc.subject.keywordPlus: HRI
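The abstract describes the robot's emotional state as the displacement of a mass-spring-damper system driven by emotional stimuli. As a minimal illustrative sketch only: the function names, parameter values, and forcing sequence below are assumptions, and the paper's elastic-hysteresis spring is simplified here to an ordinary linear spring.

```python
# Hedged sketch of a mass-spring-damper emotional-state model:
#   m*x'' + c*x' + k*x = stimulus
# x is the emotional displacement (negative = "bruised" state).
# All parameters are illustrative, not taken from the paper.

def step(x, v, stimulus, m=1.0, c=0.8, k=2.0, dt=0.05):
    """Advance displacement x and velocity v by one time step
    using semi-implicit Euler integration."""
    a = (stimulus - c * v - k * x) / m
    v += a * dt
    x += v * dt
    return x, v

def simulate(stimuli, dt=0.05):
    """Return the emotional-state trajectory for a stimulus sequence."""
    x, v, traj = 0.0, 0.0, []
    for s in stimuli:
        x, v = step(x, v, s, dt=dt)
        traj.append(x)
    return traj

# A burst of negative stimulus (e.g. an angry utterance) deflects the
# state into the negative region, then the damped spring pulls it back
# toward neutral -- qualitatively, the bruise appears and then fades.
traj = simulate([-5.0] * 10 + [0.0] * 190)
```

The damping term is what gives the "emotional wound heals over time" behavior: with no further stimulus, the displacement rings and decays back toward the neutral resting state.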
Appears in Collection
ME-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.