What is That in Your Hand?: Recognizing Grasped Objects via Forearm Electromyography Sensing

2018 
Knowing the object in hand can offer essential contextual information revealing a user's fine-grained activities. In this paper, we investigate the feasibility, accuracy, and robustness of recognizing the uninstrumented object in a user's hand by sensing and decoding her forearm muscular activity via off-the-shelf electromyography (EMG) sensors. We present results from three studies that advance our fundamental understanding of the opportunities EMG brings to object-interaction recognition. In the first study, we investigated the influence of physical properties of objects, such as shape, size, and weight, on EMG signals. We also conducted a thorough exploration of feature spaces and sensor positions, providing a solid base for future designers and practitioners of such interaction techniques. In the second study, we assessed the feasibility and accuracy of inferring the type of grasped object using forearm muscular activity as a cue. Our results indicate that object types can be recognized with up to 94.2% accuracy using user-dependent training. In the third study, we investigated the robustness of this approach in a realistic office setting where users interacted with objects as they would naturally. Our approach achieved up to 82.5% accuracy in discriminating 15 types of objects, even when training and testing phases were purposefully performed on different days to incorporate changes in EMG patterns over time. Overall, this work contributes a set of fundamental findings and guidelines on using EMG technologies for object-based activity tracking.
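The pipeline the abstract describes (EMG features extracted per sensor window, then a user-dependent classifier over object types) can be illustrated with a minimal sketch. The paper's exact feature set and classifier are not given here, so this example assumes standard time-domain EMG features (root mean square, zero crossings, waveform length) over multi-channel windows, an SVM classifier, and synthetic signals standing in for real forearm recordings:

```python
# Hedged sketch of EMG-based object recognition: NOT the authors' exact
# method. Uses common time-domain EMG features and an assumed SVM pipeline
# on synthetic data in place of real forearm recordings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def emg_features(window):
    """window: (channels, samples) array -> concatenated per-channel features."""
    rms = np.sqrt(np.mean(window ** 2, axis=1))              # root mean square
    zc = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1)  # zero crossings
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)     # waveform length
    return np.concatenate([rms, zc, wl])

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 200  # hypothetical armband: 8 channels, 200-sample windows

def synth_windows(scale, n=40):
    # Synthetic stand-in for windows recorded while grasping one object type;
    # different grasps are modeled crudely as different signal amplitudes.
    return [rng.normal(0.0, scale, (n_channels, n_samples)) for _ in range(n)]

windows = synth_windows(0.5) + synth_windows(2.0)  # two "object" classes
X = np.array([emg_features(w) for w in windows])
y = np.array([0] * 40 + [1] * 40)                  # object-type labels

# User-dependent training: fit on windows from the same (synthetic) user.
clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
print(clf.score(X, y))
```

In practice, windows from the 15 object types would replace the two synthetic classes, and evaluation would hold out sessions recorded on a different day, as in the third study, rather than scoring on the training windows.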