Metaphoric Hand Gestures for Orientation-Aware VR Object Manipulation With an Egocentric Viewpoint

2017 
We present a novel natural user interface framework, called Meta-Gesture, for selecting and manipulating rotatable virtual reality (VR) objects from an egocentric viewpoint. Meta-Gesture uses the gestures of holding and manipulating everyday tools: the holding gesture summons a virtual object into the palm, and the manipulating gesture triggers the function of the summoned virtual tool. Our contributions are threefold: 1) Meta-Gesture is the first framework to perform bare-hand, gesture-based, orientation-aware selection and manipulation of very small (nail-sized) VR objects, made possible by combining a stable 3-D palm pose estimator (publicly available) with the proposed static-dynamic (SD) gesture estimator; 2) the proposed SD random forest, serving as the SD gesture estimator, classifies a 3-D static gesture and its action status hierarchically within a single classifier; and 3) our voxel coding scheme, called the layered shape pattern, which is computed from the fill rate of the point cloud (the raw data source) in each voxel on top of the estimated palm pose, removes the need for prior hand skeletal tracking or joint classification when defining a gesture. Experimental results show that the proposed method delivers promising performance, even under frequent occlusions, during orientation-aware selection and manipulation of objects in VR space while wearing a head-mounted display with an attached egocentric depth camera (see the supplementary video available at http://ieeexplore.ieee.org).
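To make the layered shape pattern concrete, the following is a minimal sketch of a voxel fill-rate descriptor built on top of an estimated palm pose. The cube size, grid resolution, axis convention (z pointing out of the palm), and per-point normalization are illustrative assumptions; the abstract does not specify the paper's exact voxel layout.

```python
import numpy as np

def layered_shape_pattern(points, palm_R, palm_t,
                          cube_side=0.20, grid=(4, 4, 4)):
    """Sketch of a layered-shape-pattern-style voxel descriptor.

    points : (N, 3) point cloud in camera coordinates (metres).
    palm_R : (3, 3) rotation of the estimated palm pose.
    palm_t : (3,)  translation of the estimated palm pose.
    cube_side, grid : illustrative choices, not the paper's values.
    """
    # Express the cloud in the palm-centred frame so the descriptor
    # rotates with the hand (orientation-aware, no skeletal tracking).
    # (p - t) @ R is the row-vector form of R^T (p - t).
    local = (points - palm_t) @ palm_R

    # Keep only points inside a cube sitting on top of the palm
    # (assumed convention: local z axis points out of the palm).
    half = cube_side / 2.0
    lo = np.array([-half, -half, 0.0])
    hi = np.array([half, half, cube_side])
    inside = np.all((local >= lo) & (local < hi), axis=1)
    local = local[inside]

    # Map each surviving point to its voxel index.
    idx = ((local - lo) / (hi - lo) * np.array(grid)).astype(int)
    idx = np.clip(idx, 0, np.array(grid) - 1)

    # Fill rate: point count per voxel, normalised by the total number
    # of in-cube points (one plausible normalisation).
    counts = np.zeros(grid)
    np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return (counts / max(len(local), 1)).ravel()
```

The flattened vector is a fixed-length gesture feature that can be fed directly to a classifier such as the paper's SD random forest, which is what lets the method skip hand skeletal tracking and joint classification entirely.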
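The abstract does not detail the internals of the SD random forest, which performs the hierarchical gesture/status classification inside a single forest. As a loose, flattened stand-in (explicitly not the paper's scheme), one can train an ordinary random forest on joint (static gesture, action status) labels; all names and the toy data below are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_GESTURES = 5                      # hypothetical gesture vocabulary size
STATUSES = ("static", "acting")     # gesture held vs. being performed

def encode(gesture_id, status_id):
    # Fold the two-level label into one class id for a flat classifier.
    return gesture_id * len(STATUSES) + status_id

def decode(label):
    # Recover (gesture_id, status_id) from the joint class id.
    return divmod(int(label), len(STATUSES))

# Placeholder training set: layered-shape-pattern features and joint labels.
X = np.random.rand(200, 64)
y = np.random.randint(0, N_GESTURES * len(STATUSES), 200)

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
gesture, status = decode(clf.predict(X[:1])[0])
print(f"gesture {gesture}, status {STATUSES[status]}")
```

The joint-label trick recovers both outputs from one classifier, but unlike the paper's SD forest it does not exploit the hierarchy between a static gesture and its action status during tree construction.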