Beyond the Touchscreen: An Exploration of Extending Interactions on Commodity Smartphones

2016 
Most smartphones today have a rich set of sensors that could be used to infer input (e.g., accelerometer, gyroscope, microphone); however, the primary mode of interaction is still limited to the front-facing touchscreen and a few physical buttons on the case. To investigate opportunities for interactions supported by built-in sensors, we present the implementation and evaluation of BeyondTouch, a family of interactions that extend and enrich the input experience of a smartphone. Using only the existing sensing capabilities of a commodity smartphone, we offer the user a wide variety of additional inputs on the case and on the surface adjacent to the smartphone. Although most of these interactions are implemented with machine learning methods, compact and robust rule-based detection methods can also be applied to recognize some interactions by analyzing the physical characteristics of tapping events on the phone. This article is an extended version of Zhang et al. [2015], which solely covered gestures implemented with machine learning methods. We extend that work by adding gestures implemented with rule-based methods, which work well across different users and devices without collecting any training data. We outline the implementation of both the machine learning and rule-based methods for these interaction techniques and present empirical evidence of their effectiveness and usability. We also discuss the practicality of BeyondTouch for a variety of application scenarios and compare the two implementation methods.
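To make the rule-based idea concrete, the sketch below shows one plausible way to flag tap events from a sampled 3-axis accelerometer stream: a sharp tap produces a brief spike in the high-pass-filtered acceleration magnitude, which can be detected with a fixed threshold and a refractory window. This is a minimal illustration only; the function name, threshold, and timing values are assumptions for exposition and are not the detection rules or parameters from the paper.

```python
import numpy as np

def detect_taps(accel, fs=100.0, threshold=1.5, refractory_s=0.15):
    """Flag candidate tap events in a 3-axis accelerometer stream.

    accel: (N, 3) array of acceleration samples in m/s^2.
    fs: sampling rate in Hz.
    threshold: spike-magnitude threshold (illustrative value).
    refractory_s: minimum spacing between detected taps, in seconds.
    Returns a list of sample indices where taps are detected.
    """
    # Magnitude of the acceleration vector at each sample.
    mag = np.linalg.norm(accel, axis=1)
    # First difference acts as a crude high-pass filter, removing the
    # constant gravity component; a tap appears as a short, large spike.
    spikes = np.abs(np.diff(mag))
    taps, last = [], -np.inf
    refractory = int(refractory_s * fs)
    for i, v in enumerate(spikes):
        if v > threshold and i - last > refractory:
            taps.append(i)
            last = i
    return taps
```

Because a detector like this relies only on simple physical characteristics of the signal rather than learned models, it needs no per-user training data, which is the practical advantage the abstract attributes to the rule-based methods.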