Enhancing hand force estimation in human-computer interface via integration of sEMG and vision modalities

Accurately estimating hand pressure is crucial for enabling precise and natural interactions in force-based human-computer interfaces. Traditional methods using force sensors embedded in gloves or load cells hinder natural hand movements and reduce user comfort. Non-invasive techniques such as forearm-worn surface electromyography (sEMG) sensors offer a less intrusive alternative but face challenges due to the complexity of muscle activation patterns. This dissertation introduces PiMForce, a novel framework that enhances hand pressure estimation by integrating 3D hand posture information with forearm sEMG signals. By combining these two modalities, PiMForce disambiguates similar muscle activation patterns occurring across different hand postures and interaction scenarios. A multimodal hand data collection system was developed to capture synchronized data of hand posture, sEMG signals, and exerted hand pressure across various interactions. Using this dataset, a deep learning model was trained to predict whole-hand pressure distributions from the integrated inputs. Experiments demonstrate that PiMForce significantly improves pressure estimation accuracy compared to traditional sEMG-based and vision-based methods, achieving an R² value of 88.86% and an NRMSE of 6.65%. The framework also generalizes well to unseen users and operates without a pressure glove during inference by utilizing off-the-shelf hand pose detectors. The contributions of this work include the introduction of the PiMForce framework, the development of a comprehensive multimodal dataset, and extensive experiments showcasing the effectiveness of the proposed approach. The findings suggest that integrating 3D hand posture information with sEMG signals enables accurate and robust hand pressure estimation, facilitating more natural and intuitive force-aware interactions in human-computer interfaces.
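The abstract reports R² and NRMSE as evaluation metrics. As a point of reference, the two metrics can be computed as follows; note this is a generic sketch, not code from the thesis, and the choice of range normalization for NRMSE is an assumption (the thesis may normalize by the mean or standard deviation instead):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - (residual SS / total SS)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def nrmse(y_true, y_pred):
    """RMSE normalized by the range of the ground truth (one common convention)."""
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (np.max(y_true) - np.min(y_true))
```

For example, a constant prediction offset of 0.5 over the targets `[0, 1, 2, 3]` yields R² = 0.8 and NRMSE ≈ 0.167 under these definitions.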
Advisors
Yoon, Sang Ho
Description
Korea Advanced Institute of Science and Technology (KAIST): Graduate School of Culture Technology
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2025
Identifier
325007
Language
eng
Description

Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): Graduate School of Culture Technology, 2025.2, [v, 47 p.]

Keywords

Hand force estimation; surface electromyography (sEMG); vision and tracking data; multimodal

URI
http://hdl.handle.net/10203/332641
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1122494&flag=dissertation
Appears in Collection
GCT-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
