Motion-to-tactile rendering framework with wearable haptic devices for immersive VR performance experience (Korean title: A study on an avatar-motion-based wearable haptic rendering framework for highly immersive VR performance experiences)

We present a novel haptic rendering framework that translates a performer’s motions into wearable vibrotactile feedback for an immersive virtual reality (VR) performance experience. Our rendering pipeline extracts meaningful vibrotactile parameters, including intensity and location, from the performer’s upper-body movements, which play a significant role in dance performance. Accordingly, we customize a haptic vest and sleeves to deliver vibrotactile feedback on the front and back of the torso as well as the shoulders. To capture essential movements from the VR performance, we propose a method called the motion salient triangle (MST), which uses the movements of key skeleton joints to compute the associated haptic parameters. Our method supports translating both choreographic and communicative motions into vibrotactile feedback. Through a series of user studies, we validate that users prefer our method over conventional motion-to-tactile and audio-to-tactile methods.
Advisors
윤상호
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2024
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: Graduate School of Metaverse, 2024.8, [iv, 36 p.]

Keywords

Haptics; Virtual Reality; Rendering Framework; Wearables; Performance

URI
http://hdl.handle.net/10203/331878
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1110020&flag=dissertation
Appears in Collection
GCT-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
