DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Woojoo | ko |
dc.contributor.author | Xiong, Shuping | ko |
dc.date.accessioned | 2021-06-07T06:10:07Z | - |
dc.date.available | 2021-06-07T06:10:07Z | - |
dc.date.created | 2021-04-28 | - |
dc.date.issued | 2021-08 | - |
dc.identifier.citation | INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.152, no.102648 | - |
dc.identifier.issn | 1071-5819 | - |
dc.identifier.uri | http://hdl.handle.net/10203/285553 | - |
dc.description.abstract | Locomotion is one of the fundamental interactions in VR. As a relatively simple and easy-to-implement method for VR locomotion, walking-in-place (WIP) techniques have been actively developed and have shown advantages in terms of spatial constraints and immersion compared to real walking or controller-based interaction. However, existing WIP gestures have largely been adapted or designed from the perspective of developers, not users, which may result in not only a higher cognitive load to learn and memorize, but also worse presence and increased sensory conflict in VR. Therefore, this study aims to elicit and evaluate WIP gestures for different walking directions (forward, sideways, and backward) from users so that a complete user-defined WIP gesture set for VR locomotion can be generated. Two sequential user studies were conducted. In Experiment 1, 20 participants experienced the movement while wearing a VR headset and elicited a gesture for each of 8 walking directions. The grouping analysis revealed that Turn body + Stepping-in-place (SIP) and Step one foot + SIP/Rock/Stay were promising WIP gesture sets for VR locomotion. In the follow-up experiment (Experiment 2), 21 new participants experienced and compared the generated gesture sets and existing ones. The results showed that SIP performed the best for forward walking, while Step one foot + SIP/Rock/Stay was promising for the other walking directions, depending on the application scenario and the trade-off between efficiency and naturalness. The generated user-defined WIP gesture sets can be used in VR applications to provide a better user experience and greater movement options in VR locomotion. | - |
dc.language | English | - |
dc.publisher | ACADEMIC PRESS LTD - ELSEVIER SCIENCE LTD | - |
dc.title | User-Defined Walking-in-Place Gestures for VR Locomotion | - |
dc.type | Article | - |
dc.identifier.wosid | 000652561000006 | - |
dc.identifier.scopusid | 2-s2.0-85104076071 | - |
dc.type.rims | ART | - |
dc.citation.volume | 152 | - |
dc.citation.issue | 102648 | - |
dc.citation.publicationname | INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES | - |
dc.identifier.doi | 10.1016/j.ijhcs.2021.102648 | - |
dc.contributor.localauthor | Xiong, Shuping | - |
dc.description.isOpenAccess | N | - |
dc.type.journalArticle | Article | - |
dc.subject.keywordAuthor | Walking-in-place | - |
dc.subject.keywordAuthor | Gesture elicitation | - |
dc.subject.keywordAuthor | VR locomotion | - |
dc.subject.keywordAuthor | User-defined | - |
dc.subject.keywordAuthor | Navigation control | - |
dc.subject.keywordAuthor | Virtual environment | - |
dc.subject.keywordPlus | PATTERNS | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.