Eyes-free Target Acquisition During Walking in Immersive Mixed Reality.

2020 
Reaching towards out-of-sight objects while walking is a common task in daily life; however, the same task can be challenging when wearing an immersive Head-Mounted Display (HMD). In this paper, we investigate the effects of spatial reference frame, walking path curvature, and target placement relative to the body on users' performance in manually acquiring out-of-sight targets located around their bodies, as they walk in a spatial-mapping Mixed Reality (MR) environment wearing an immersive HMD. We found that walking and increased path curvature negatively affected overall spatial accuracy, and that performance benefited more from using the torso as the reference frame than the head. We also found that targets placed at maximum reaching distance yielded less error in the angular rotation and depth of the reaching arm. We discuss our findings with regard to human walking kinesthetics and sensory integration in the peripersonal space during locomotion in immersive MR. We offer design guidelines for future immersive MR experiences featuring spatial mapping and full-body motion tracking to provide a better embodied experience.
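As a rough illustration of the torso-anchored target placement the findings favor over a head-anchored one, the following Python sketch computes a target's world position from a tracked torso pose, so the target stays fixed relative to the walking body even as the head turns. All names and parameters (TorsoPose, azimuth, reach, height) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: placing an eyes-free target in a body-anchored
# (torso-referenced) frame rather than a head-referenced frame.
import math
from dataclasses import dataclass


@dataclass
class TorsoPose:
    x: float        # torso position in world space (metres)
    y: float
    z: float
    yaw_deg: float  # torso heading around the vertical axis (degrees)


def torso_anchored_target(pose: TorsoPose,
                          azimuth_deg: float,
                          reach_m: float,
                          height_m: float) -> tuple[float, float, float]:
    """World position of a target defined relative to the torso:
    azimuth_deg - angle around the body (0 = straight ahead),
    reach_m     - radial distance from the torso (e.g. maximum reach),
    height_m    - vertical offset from the torso origin."""
    heading = math.radians(pose.yaw_deg + azimuth_deg)
    return (pose.x + reach_m * math.sin(heading),
            pose.y + height_m,
            pose.z + reach_m * math.cos(heading))


if __name__ == "__main__":
    # Example: a target at the user's right side, at arm's length,
    # slightly below torso height.
    pose = TorsoPose(x=1.0, y=1.1, z=2.0, yaw_deg=30.0)
    print(torso_anchored_target(pose, azimuth_deg=90.0, reach_m=0.7, height_m=-0.2))
```

Because the target is parameterised by the torso's position and yaw only, head rotation does not move it, which is consistent with the reported advantage of the torso reference frame for eyes-free acquisition while walking.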