Tactile-Based Object Recognition Using a Grasp-Centric Exploration

2021 
As humans, our grasping and manipulation skills depend heavily on our ability to perceive tactile properties. In contrast, most of today's robotic operations still rely predominantly on visual feedback to identify the objects to be grasped and handled. In this work, we study the problem of recognizing everyday objects based solely on their tactile attributes. This has significant practical value, as it could allow object identification when visual sensing is impossible, or assist vision in difficult contexts. Our method acquires multi-modal tactile sensing data during a quick grasp-centric exploration phase, with minimal operational cost. Our algorithm recognizes objects from a considerably large set of 50 general-purpose items with an accuracy of 98.1%. Moreover, we show that a large proportion of these objects can be reliably identified by analyzing only the deformation pattern they undergo during compression. We further study our method's ability to learn tactile properties that are relevant for classifying new objects. We also share the tactile sensing database used in this work, which contains sensor data acquired from more than 1600 experiments. Finally, we discuss the relative performance and role of each tactile modality in differentiating objects.
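The abstract does not detail the classification pipeline; the sketch below is only a minimal illustration of the kind of setup it implies, where each grasp yields a multi-modal tactile feature vector that is fed to a generic classifier. The feature layout, the random-forest choice, and all variable names are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch: recognizing objects from per-grasp tactile features.
# Assumption: each grasp/experiment is summarized as a fixed-length vector
# concatenating the available tactile modalities (e.g. pressure, vibration,
# deformation-under-compression); the classifier choice is illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_grasps, n_features, n_objects = 1600, 128, 50   # sizes mirror the abstract
X = rng.normal(size=(n_grasps, n_features))       # placeholder tactile features
y = rng.integers(0, n_objects, size=n_grasps)     # placeholder object labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)         # k-fold recognition accuracy
print(f"mean accuracy: {scores.mean():.3f}")
```

With real tactile features in place of the random placeholders, the same cross-validation loop would give the per-object recognition accuracy reported in the paper's evaluation.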