Learning human–environment interactions using conformal tactile textiles

2021 
Recording, modelling and understanding tactile interactions is important in the study of human behaviour and in the development of applications in healthcare and robotics. However, such studies remain challenging because existing wearable sensory interfaces are limited in terms of performance, flexibility, scalability and cost. Here, we report a textile-based tactile learning platform that can be used to record, monitor and learn human–environment interactions. The tactile textiles are created via digital machine knitting of inexpensive piezoresistive fibres, and can conform to arbitrary three-dimensional geometries. To ensure that our system is robust against variations in individual sensors, we use machine learning techniques for sensing correction and calibration. Using the platform, we capture diverse human–environment interactions (more than a million tactile frames) and show that the artificial-intelligence-powered sensing textiles can classify humans’ sitting poses, motions and other interactions with the environment. We also show that the platform can recover dynamic whole-body poses, reveal environmental spatial information and discover biomechanical signatures.
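To make the learning pipeline concrete, the sketch below shows one plausible way to classify frames from a knitted piezoresistive sensing grid into sitting poses. This is a minimal illustration, not the authors' code: the 32×32 grid resolution, the ten pose classes and the TactilePoseNet architecture are assumptions chosen for the example.

```python
# Minimal sketch (assumed, not the paper's implementation) of classifying
# tactile pressure frames from a knitted sensor grid into sitting poses.
import torch
import torch.nn as nn

class TactilePoseNet(nn.Module):
    """Small CNN over single-channel tactile frames (assumed 32x32 grid)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of tactile frames, shape (N, 1, 32, 32)
        return self.classifier(self.features(x).flatten(1))

model = TactilePoseNet()
frames = torch.rand(4, 1, 32, 32)   # placeholder pressure frames in [0, 1]
logits = model(frames)              # (4, 10) per-class scores
```

In practice the abstract also notes a learned per-sensor correction and calibration step, which would normalise raw resistive readings before frames reach a classifier like this one.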
References: 29
Citations: 16