Efficient 3D hand tracking in articulation subspaces for the manipulation of virtual objects

We propose an efficient method for model-based 3D tracking of hand articulations observed from an egocentric viewpoint that aims at supporting the manipulation of virtual objects. Previous model-based approaches optimize non-convex objective functions defined in the 26 Degrees of Freedom (DoFs) space of possible hand articulations. In our work, we decompose this space into six articulation subspaces (6 DoFs for the palm and 4 DoFs for each finger). We also label each finger with a Gaussian model that is propagated between successive image frames. As confirmed by a number of experiments, this divide-and-conquer approach tracks hand articulations more accurately than existing model-based approaches. At the same time, real-time performance is achieved without the need for GPGPU processing. Additional experiments show that the proposed approach is preferable for supporting the accurate manipulation of virtual objects in VR/AR scenarios.
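As a rough illustration of the subspace decomposition described in the abstract (not the authors' implementation), the sketch below splits a 26-DoF hand pose into one 6-DoF palm subspace and five 4-DoF finger subspaces and optimizes each one separately per frame. The `optimize_subspace` argument is a hypothetical placeholder for any local optimizer of the model-to-image objective restricted to that subspace.

```python
# Illustrative sketch only: divide-and-conquer tracking over articulation
# subspaces (6-DoF palm + five 4-DoF fingers = 26 DoFs), as outlined in the
# abstract. Not the authors' code; `optimize_subspace` is a stand-in.
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def split_pose(pose_26):
    """Decompose a 26-DoF pose vector into palm (6 DoF) + five fingers (4 DoF each)."""
    assert pose_26.shape == (26,)
    palm = pose_26[:6]  # global position (3) + orientation (3)
    fingers = {name: pose_26[6 + 4 * i: 10 + 4 * i]
               for i, name in enumerate(FINGERS)}
    return palm, fingers

def track_frame(prev_pose, observation, optimize_subspace):
    """One tracking step: optimize each articulation subspace independently.

    optimize_subspace(params, observation) is assumed to return refined
    parameters for that subspace given the current image observation.
    """
    palm, fingers = split_pose(prev_pose)
    palm = optimize_subspace(palm, observation)          # 6-DoF palm subspace
    fingers = {name: optimize_subspace(p, observation)   # each 4-DoF finger subspace
               for name, p in fingers.items()}
    return np.concatenate([palm] + [fingers[n] for n in FINGERS])
```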
Publisher
Association for Computing Machinery
Issue Date
2016-06-29
Language
English
Citation
33rd Computer Graphics International Conference, CGI 2016, pp. 33-36
DOI
10.1145/2949035.2949044
URI
http://hdl.handle.net/10203/224446
Appears in Collection
GCT-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
