Hands-Free: a robot augmented reality teleoperation system
2020
In this paper the novel teleoperation method "Hands-Free" is presented. Hands-Free is a vision-based augmented reality system that allows users to teleoperate a robot end-effector with their hands in real time. The system leverages the OpenPose neural network to detect the human operator's hand in a given workspace, achieving an average inference time of 0.15 s. The position of the user's index finger is extracted from the image and converted into real-world coordinates to move the robot end-effector in a different workspace. The user's hand skeleton is visualized in real time moving in the actual robot workspace, allowing the user to teleoperate the robot intuitively, regardless of the differences between the user workspace and the robot workspace. Since a set of calibration procedures is required to convert the index finger position to the robot end-effector position, we designed three experiments to determine the different errors introduced by the conversion. A detailed explanation of the mathematical principles adopted in this work is provided in the paper. Finally, the proposed system has been developed using ROS and is publicly available at the following GitHub repository: https://github.com/Krissy93/hands-free-project.
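As a rough illustration of the pixel-to-workspace conversion described above (this is not the paper's actual calibration code; the homography values and pixel coordinates below are invented for the example), mapping a detected index-fingertip pixel to planar robot coordinates can be sketched with a calibrated homography:

```python
import numpy as np

# Hypothetical 3x3 homography, assumed to come from a prior calibration
# procedure, mapping image pixels (u, v) to planar workspace coordinates
# in metres. The numeric values are illustrative only.
H = np.array([
    [0.001,  0.0,   -0.32],
    [0.0,   -0.001,  0.24],
    [0.0,    0.0,    1.0 ],
])

def pixel_to_workspace(u, v, H):
    """Map a fingertip pixel (u, v) to planar workspace coordinates (x, y)."""
    p = H @ np.array([u, v, 1.0])  # apply homogeneous transform
    return p[:2] / p[2]            # perspective normalisation

# Example: a fingertip detected at the image point (320, 240).
x, y = pixel_to_workspace(320, 240, H)
```

In a full pipeline, the resulting (x, y) would then be expressed in the robot base frame and sent as an end-effector target; the paper's three experiments quantify the errors that such conversions introduce.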