Control of Bio-Inspired Multi-robots Through Gestures Using Convolutional Neural Networks in Simulated Environment

2021 
In this paper, three convolutional neural networks (VGG19, GoogLeNet, and AlexNet) are compared for the control of bio-inspired multi-robots in a simulated environment, using hand gestures captured in real time by a webcam. Six gestures were used to train the networks and control the robots, with each gesture corresponding to one action; both collective and individual actions were defined, and the simulation contains four bio-inspired robots. The performance of the networks in classifying gestures to control the robots is compared. All three proved efficient in gesture classification and agent control, with AlexNet achieving an accuracy of 98.33%, VGG19 98.06%, and GoogLeNet 96.94%.
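The paper does not publish its implementation, but the described pipeline (a pretrained CNN fine-tuned on six gesture classes and fed live webcam frames, whose predicted class is mapped to a robot command) can be illustrated with a minimal sketch. The snippet below uses AlexNet from torchvision as an example; the gesture labels, preprocessing choices, and action mapping are assumptions, and the fine-tuning of the replaced classifier head on the gesture dataset is not shown.

```python
# Hedged sketch: a pretrained AlexNet adapted to six gesture classes and used
# for real-time webcam classification. Labels and mapping are assumed, not
# taken from the paper; the new head would first be trained on gesture images.
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms

GESTURES = ["stop", "forward", "backward", "left", "right", "disperse"]  # assumed labels

# Load AlexNet pretrained on ImageNet and replace the final layer for 6 classes.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, len(GESTURES))
model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_frame(frame_bgr):
    """Classify a single webcam frame into one of the six gestures."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0)
    with torch.no_grad():
        logits = model(batch)
    return GESTURES[int(logits.argmax(dim=1))]

# Real-time loop: each predicted gesture would be forwarded to the simulator
# (e.g. as a collective or individual command to the four robots).
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gesture = classify_frame(frame)
    cv2.putText(frame, gesture, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gesture control", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The same structure applies to VGG19 and GoogLeNet by swapping the backbone and replacing its final fully connected layer with a six-way output.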