WeARHand: Head-worn, RGB-D camera-based, bare-hand user interface with visually enhanced depth perception

Cited 39 times in Web of Science; cited 0 times in Scopus
We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method uses no environmentally tethered tracking devices; instead, it exploits depth input from a pair of near-range and far-range RGB-D cameras mounted on a head-worn display to localize the cameras and a moving bare hand in 3D space. Depth perception is enhanced through egocentric visual feedback, including a semi-transparent proxy hand. We implement a virtual hand interaction technique and feedback approaches, and evaluate their performance and usability. The proposed method can be applied to many 3D interaction scenarios that use hands in a wearable AR environment, such as AR information browsing, maintenance, design, and games.
Publisher
IEEE Computer Society Visualization and Graphics Technical Committee
Issue Date
2014-09
Language
English
Citation

13th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2014, pp.219 - 228

ISSN
1554-7868
DOI
10.1109/ISMAR.2014.6948431
URI
http://hdl.handle.net/10203/314735
Appears in Collection
GCT-Conference Papers (conference papers)
Files in This Item
There are no files associated with this item.