Integrating wearable muscle sensing for hand-based interactions in mixed reality

DC Field: Value
dc.contributor.advisor: 우운택 (Woo, Woontack)
dc.contributor.author: Kim, Hyung-il
dc.contributor.author: 김형일 (Kim, Hyung-il)
dc.date.accessioned: 2024-07-26T19:30:27Z
dc.date.available: 2024-07-26T19:30:27Z
dc.date.issued: 2023
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1046598&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/320827
dc.description: Doctoral dissertation (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST), Graduate School of Culture Technology, 2023.8, [vi, 68 p.]
dc.description.abstract: This dissertation proposes a method for utilizing wearable muscle sensing technology to recognize users' hand force and touch information in mixed reality (MR) environments and to enhance hand-based interactions and collaborations. Although mixed reality technologies, encompassing augmented reality (AR) and virtual reality (VR), have advanced, hand-based interaction for real-virtual 3D tasks still relies heavily on the visual information provided by head-mounted displays, which limits natural and effective interaction. To address this, the dissertation presents a comprehensive framework for integrating wearable muscle sensing into MR interactions and collaborations that accounts for the user's posture, the interacting objects, and the sensor signals. A system is then developed that applies wearable electromyography (EMG) sensors to measure a worker's hand force in an MR remote collaboration setting, and user studies evaluate how visualizing the worker's hand force affects the collaborator's task awareness. The visualization of hand force data positively influences task comprehension, force perception, object weight estimation, and the collaborator's sense of social presence. The dissertation further proposes a system that uses EMG sensors to estimate accurate touch events and touch intensity in MR interactions, enabling precise, force-sensitive touch input with all fingers. In conclusion, the findings show that muscle sensing can improve MR interactions and collaborations by supplementing users' hand information with force information and accurate hand gesture recognition.
dc.language: eng
dc.publisher: 한국과학기술원 (Korea Advanced Institute of Science and Technology, KAIST)
dc.subject: 증강현실; 가상현실; 혼합현실; 인간-컴퓨터 상호작용; 웨어러블 컴퓨팅
dc.subject: Augmented reality; Virtual reality; Mixed reality; Human-computer interaction; Wearable computing
dc.title: Integrating wearable muscle sensing for hand-based interactions in mixed reality
dc.title.alternative: 혼합 현실 손 기반 상호작용을 위한 착용형 근육 센싱 결합
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST), Graduate School of Culture Technology
dc.contributor.alternativeauthor: Woo, Woontack
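
The abstract above describes estimating touch events and touch intensity from wearable EMG signals. Below is a minimal, illustrative Python sketch of one common approach, not the dissertation's implementation: rectify a single EMG channel, smooth it into an envelope, normalize by a per-user calibration maximum, and threshold the result for touch onsets. The sampling rate, window length, threshold value, and all function names here are assumptions made for illustration only.

```python
# Minimal sketch (assumed pipeline, not the dissertation's system): derive a
# touch-intensity proxy from a single-channel EMG stream via rectification,
# moving-average envelope extraction, calibration-based normalization, and
# a simple onset threshold. All constants below are illustrative assumptions.
import numpy as np

FS = 1000                # assumed EMG sampling rate in Hz
WINDOW_S = 0.050         # assumed 50 ms smoothing window
ONSET_THRESHOLD = 0.10   # assumed onset threshold as a fraction of calibration max


def emg_envelope(emg: np.ndarray, fs: int = FS, window_s: float = WINDOW_S) -> np.ndarray:
    """Full-wave rectification followed by a moving-average envelope."""
    rectified = np.abs(emg - np.mean(emg))      # remove DC offset, then rectify
    win = max(1, int(fs * window_s))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")


def touch_intensity(envelope: np.ndarray, calibration_max: float) -> np.ndarray:
    """Normalize the envelope by a per-user calibration maximum (e.g. a firm press)."""
    return np.clip(envelope / calibration_max, 0.0, 1.0)


def touch_events(intensity: np.ndarray, threshold: float = ONSET_THRESHOLD) -> np.ndarray:
    """Boolean mask that is True wherever normalized intensity exceeds the onset threshold."""
    return intensity > threshold


if __name__ == "__main__":
    # Synthetic example: 2 s of baseline noise with a burst of "muscle activity" in the middle.
    rng = np.random.default_rng(0)
    t = np.arange(0, 2.0, 1.0 / FS)
    emg = 0.02 * rng.standard_normal(t.size)
    emg[800:1200] += 0.3 * np.sin(2 * np.pi * 80 * t[800:1200]) * rng.standard_normal(400)

    env = emg_envelope(emg)
    intensity = touch_intensity(env, calibration_max=env.max())
    print("touch samples detected:", int(touch_events(intensity).sum()))
```

In practice a multi-channel EMG armband and a learned model would replace the single channel and fixed threshold, but the rectify-smooth-normalize flow conveys the basic signal path from muscle activity to a continuous touch-intensity estimate.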
Appears in Collection: GCT-Theses_Ph.D. (박사논문, doctoral dissertations)
Files in This Item
There are no files associated with this item.
