DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Chung, Myung-Jin | - |
dc.contributor.advisor | 정명진 | - |
dc.contributor.author | Lee, Hui-Sung | - |
dc.contributor.author | 이희승 | - |
dc.date.accessioned | 2011-12-14 | - |
dc.date.available | 2011-12-14 | - |
dc.date.issued | 2008 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=295408&flag=dissertation | - |
dc.identifier.uri | http://hdl.handle.net/10203/35448 | - |
dc.description | Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST) : Department of Electrical and Electronics Engineering, 2008.2, [ xvi, 120 p. ] | - |
dc.description.abstract | A robot's face is its symbolic feature, and its facial expressions are the most effective channel for conveying emotional information to people. Moreover, a robot's facial expressions play an important role in human-robot emotional interaction. Just as we identify a person by his or her face, we should be able to distinguish a robot by its face. Furthermore, techniques that enable robot faces to express emotions will become a crucial factor in future robot development and embodiment. People expect a humanoid or android to express its internal or emotional state in ways similar to humans, and humans can immediately recognize a robot's internal state from its facial expressions. Humanlike facial expressions should therefore be used so that robots can display easily recognizable emotional interaction. Hence, the development of human-friendly robot systems that enable robots to show their emotions to people is necessary. Numerous and varied robot heads and faces have been built so far, and different facial forms require different ways of embodying expressions for each of them. There is no general rule for this process, and there have been insufficient objective resources for determining where, and how many, control points should be placed on a robot face. Although various methods exist for embodying robot expressions, most robots cannot express continuous and natural changes of facial expression efficiently and can only show predefined, limited emotions. To develop human-friendly robot systems that enable robots to show their emotions effectively, it must be considered how to design facial robots that are friendly to humans and how to control the expressional components effectively. This dissertation organizes a general rule and procedure for the design and realization of expressions in the development of mascot-type facial robots. 
A mascot-type facial robo... | eng |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | expression | - |
dc.subject | affect space | - |
dc.subject | emotion | - |
dc.subject | facial robot | - |
dc.subject | dynamic emotion | - |
dc.subject | 표정 | - |
dc.subject | 정서공간 | - |
dc.subject | 감정 | - |
dc.subject | 얼굴 로봇 | - |
dc.subject | 동적 감정 | - |
dc.title | (A) linear affect-expression space model and mascot-type facial robot for effective expressions | - |
dc.title.alternative | 효과적인 표정 구현을 위한 선형 정서-표정 공간 모델과 마스코트형 얼굴 로봇 | - |
dc.type | Thesis (Ph.D.) | - |
dc.identifier.CNRN | 295408/325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology (KAIST) : Department of Electrical and Electronics Engineering | - |
dc.identifier.uid | 020025252 | - |
dc.contributor.localauthor | Chung, Myung-Jin | - |
dc.contributor.localauthor | 정명진 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
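The abstract describes a "linear affect-expression space model" only at a high level. As a purely illustrative sketch of the general idea (not the dissertation's actual formulation), a linear map could take an affect vector, such as valence and arousal, to facial actuator positions; the matrix, the actuator names, and the neutral pose below are all hypothetical values chosen for the example:

```python
import numpy as np

# Hypothetical linear affect-to-expression mapping.
# Affect space: 2-D (valence, arousal); expression space: 4 actuator
# positions (brow, eyelid, mouth corner, jaw). W and the neutral pose
# are illustrative assumptions, not values from the dissertation.
W = np.array([
    [ 0.6, -0.2],   # brow raise
    [-0.3,  0.5],   # eyelid opening
    [ 0.8,  0.1],   # mouth-corner lift
    [ 0.0,  0.4],   # jaw opening
])
neutral = np.array([0.5, 0.5, 0.5, 0.5])  # neutral actuator pose

def expression(affect):
    """Map an affect vector to actuator positions, clipped to [0, 1]."""
    return np.clip(neutral + W @ np.asarray(affect, dtype=float), 0.0, 1.0)

# A zero affect vector leaves the face at the neutral pose; interpolating
# the affect vector moves every actuator continuously, which is one way a
# linear model can yield continuous, natural expression changes.
happy = expression([0.8, 0.3])  # high valence, mild arousal
```

Because the mapping is linear, blending two emotions reduces to blending their affect vectors before the mapping, which is the kind of continuous transition the abstract says predefined, discrete expression sets cannot provide.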