Video-guided motion capture and transfer using example motions

In this thesis, we present a novel example-based paradigm for producing motions of various characters. With a rich repertoire of captured motions, our paradigm extends previous methods for motion synthesis in two directions: first, we produce realistic human motions guided by monocular videos, which are the most common source of human motions; second, we apply those motions to diverse characters with different structures. After processing an input video, we select a pre-captured motion clip, called a "reference motion", from a motion library, and then compute the sequence of body configurations of a character based on a spacetime formulation. The root trajectory is estimated by using kinematic constraints and the dynamic properties of the character. After synthesizing a human motion from the monocular video, the motion is cloned frame by frame to a target character based on scattered data interpolation. To do this, we exploit the correspondence between key postures of the source character and those of the target character. Through experiments, we demonstrate that our scheme can effectively produce a variety of motions for character animation.
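The record does not include code; as a rough illustration of the scattered-data-interpolation step only, the Python sketch below clones one source frame to a target character by interpolating over key-posture pairs. The Gaussian radial-basis kernel, the array shapes, and all function names are assumptions for illustration, not the author's implementation.

    # Minimal sketch (not the thesis implementation): clone a source pose to a
    # target character by scattered-data interpolation over key-posture pairs.
    import numpy as np

    def fit_rbf_weights(source_keys, target_keys, sigma=1.0):
        """Solve for RBF weights mapping source key postures to target key postures.

        source_keys: (k, ds) array, k source key postures (e.g. stacked joint angles)
        target_keys: (k, dt) array, corresponding target key postures
        """
        d = np.linalg.norm(source_keys[:, None, :] - source_keys[None, :, :], axis=-1)
        Phi = np.exp(-(d / sigma) ** 2)              # Gaussian radial basis matrix (k, k)
        weights = np.linalg.solve(Phi, target_keys)  # weights (k, dt)
        return weights

    def clone_pose(source_pose, source_keys, weights, sigma=1.0):
        """Interpolate a target pose for one frame of the source motion."""
        d = np.linalg.norm(source_keys - source_pose[None, :], axis=-1)
        phi = np.exp(-(d / sigma) ** 2)              # basis values (k,)
        return phi @ weights                         # target pose (dt,)

    # Usage: clone a synthesized human motion frame by frame.
    # In practice source_keys/target_keys would come from the key-posture correspondence.
    source_keys = np.random.rand(5, 30)   # 5 key postures, 30-DOF source character
    target_keys = np.random.rand(5, 42)   # corresponding postures, 42-DOF target
    W = fit_rbf_weights(source_keys, target_keys)
    frame = np.random.rand(30)            # one frame of the synthesized source motion
    target_frame = clone_pose(frame, source_keys, W)

At the key postures themselves this interpolation reproduces the corresponding target postures exactly, which is the property that makes a key-posture correspondence a natural basis for frame-by-frame cloning.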
Advisors
Shin, Sung-Yong (신성용)
Description
Korea Advanced Institute of Science and Technology : Division of Computer Science
Publisher
Korea Advanced Institute of Science and Technology
Issue Date
2005
Identifier
244974/325007  / 000995136
Language
eng
Description

Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Division of Computer Science, 2005.2, [iv, 68 p.]

Keywords

analysis; Computer Graphics; motion; motion transfer; motion capture

URI
http://hdl.handle.net/10203/32891
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=244974&flag=dissertation
Appears in Collection
CS-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
