Real-Time Brain-Machine Interface Achieves High-Velocity Prosthetic Finger Movements using a Biologically-Inspired Neural Network Decoder

2021 
Despite rapid progress and interest in brain-machine interfaces that restore motor function, the performance of prosthetic fingers and limbs has yet to mimic native function. The algorithm that converts brain signals into a control signal for the prosthetic device is one of the limitations to achieving rapid and realistic finger movements. To address this, we developed a shallow feed-forward neural network, loosely inspired by the biological neural pathway, to decode two-degree-of-freedom finger movements in real time. Using a two-step training method, a recalibrated feedback intention-trained (ReFIT) neural network achieved higher throughput, higher finger velocities, and more natural-appearing finger movements than the ReFIT Kalman filter, which represents the current standard. The neural network decoders introduced herein are the first to demonstrate real-time decoding of continuous movements at a level superior to the current state of the art and could provide a starting point for using neural networks in the development of more naturalistic brain-controlled prostheses.
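As a rough illustration of the decoder class described above, the following is a minimal sketch of a shallow feed-forward network mapping one time bin of neural firing rates to a two-degree-of-freedom finger velocity command. The channel count, hidden-layer width, bin width, activation function, and randomly initialized weights are all illustrative assumptions, not the paper's actual architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 96   # assumed number of neural recording channels
N_HIDDEN = 256    # assumed hidden width; "shallow" = one hidden layer
N_DOF = 2         # two degrees of freedom (two finger groups)

# Random weights stand in for parameters learned during training.
W1 = rng.normal(0.0, 0.1, (N_HIDDEN, N_CHANNELS))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_DOF, N_HIDDEN))
b2 = np.zeros(N_DOF)

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Map one bin of per-channel firing rates to 2-DOF finger velocities."""
    h = np.maximum(0.0, W1 @ firing_rates + b1)  # ReLU hidden layer
    return W2 @ h + b2                           # linear velocity readout

# Simulated spike counts for one time bin -> a 2-DOF velocity command,
# which a real-time system would apply to the prosthetic each bin.
velocity = decode(rng.poisson(5.0, N_CHANNELS).astype(float))
print(velocity.shape)  # (2,)
```

In a closed-loop system this forward pass would run once per time bin, and the two-step (ReFIT) procedure would retrain the weights on intention-corrected velocities from an initial closed-loop session.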