iPand: Accurate Gesture Input with Ambient Acoustic Sensing on Hand

2018 
Finger gesture input has emerged as an increasingly popular means of human–computer interaction. In this paper, we propose iPand, an acoustic sensing system that enables finger gesture input on the skin, which is more convenient, user-friendly, and always accessible. Unlike previous works, which implement gesture input with dedicated devices, our system exploits passive acoustic sensing to identify gestures such as swipe left, swipe right, pinch, and spread. The insight behind our system is that each gesture emits a unique friction sound, which can be captured by the microphone embedded in wearable devices. We capture these acoustic signals and extract features using bandpass filters and the short-time Fourier transform. An offline convolutional neural network is adopted to recognize the gestures. iPand is implemented and evaluated using COTS smartphones and smartwatches. Experimental results show that iPand achieves recognition accuracies of 89%, 83%, and 78% in three daily scenarios (library, lab, and cafe), respectively. In particular, our system supports multi-touch input, where 2–4 fingers are enabled for more efficient and expressive gestures, and its average accuracy for individual finger gestures reaches 83% across 12 gestures.
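The feature-extraction pipeline described above (bandpass filtering followed by a short-time Fourier transform, whose magnitude spectrogram is fed to a CNN) can be sketched as follows. The filter order, passband edges, sample rate, and STFT window length are illustrative assumptions; the paper's abstract does not specify them.

```python
import numpy as np
from scipy.signal import butter, sosfilt, stft

def extract_features(audio, fs=16000, band=(1000, 6000), nperseg=256):
    """Bandpass-filter a mono audio clip, then return its STFT magnitude
    spectrogram (freq_bins x time_frames), suitable as CNN input.

    Filter order, band edges, and window size are assumed values for
    illustration, not the parameters used by iPand."""
    # 4th-order Butterworth bandpass, applied in second-order sections
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfilt(sos, audio)
    # Short-time Fourier transform with a 256-sample window
    _, _, Z = stft(filtered, fs=fs, nperseg=nperseg)
    return np.abs(Z)

# Usage on a synthetic 1-second clip (a 3 kHz tone plus noise stands in
# for a friction sound recorded by a wearable microphone)
fs = 16000
t = np.arange(fs) / fs
clip = np.sin(2 * np.pi * 3000 * t) + 0.1 * np.random.randn(fs)
spectrogram = extract_features(clip, fs=fs)
```

Each resulting spectrogram is a 2-D array (129 frequency bins for a 256-sample window), which matches the image-like input a convolutional network expects.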