A bioelectric neural interface towards intuitive prosthetic control for amputees

2020 
Objective: While prosthetic hands with independently actuated digits have become commercially available, state-of-the-art human-machine interfaces (HMI) permit control over only a limited set of grasp patterns, which does not give amputees enough improvement in their daily activities to make an active prosthesis worthwhile.

Approach: Here we present a technology platform combining fully integrated bioelectronics, implantable intrafascicular microelectrodes and deep-learning-based artificial intelligence (AI) to provide this missing bridge by tapping into the intricate motor control signals of peripheral nerves. The bioelectric neural interface includes an ultra-low-noise neural recording system to sense electroneurography (ENG) signals from microelectrode arrays implanted in the residual nerves, and AI models employing the recurrent neural network (RNN) architecture to decode the subject's motor intention.

Main results: A pilot human study was carried out on a transradial amputee. We demonstrate that the information channel established by the proposed neural interface is sufficient to provide high-accuracy control of a prosthetic hand with up to 15 degrees of freedom (DOF). The interface is intuitive, as it directly maps complex prosthesis movements to the patient's true intention.

Significance: Our study lays the foundation for not only a robust and dexterous control strategy for modern neuroprostheses at a near-natural level approaching that of the able hand, but also an intuitive conduit for connecting human minds and machines through the peripheral neural pathways. (Clinical trial identifier: NCT02994160.)
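
Because the abstract describes the decoding pipeline only at a high level, the following minimal sketch (PyTorch, not the authors' released code) illustrates the general shape of an RNN-based decoder that maps windows of multi-channel ENG features to continuous per-DOF motor intent. The GRU choice, channel count and hidden size are illustrative assumptions; the 15 outputs simply mirror the degrees of freedom reported in the study.

```python
# Hypothetical sketch of an RNN-based ENG decoder; architecture details
# (GRU, 16 channels, hidden size 128) are assumptions for illustration only.
import torch
import torch.nn as nn

class ENGMotorDecoder(nn.Module):
    def __init__(self, n_channels: int = 16, n_dof: int = 15, hidden: int = 128):
        super().__init__()
        # Recurrent layer captures temporal structure in the ENG feature stream.
        self.rnn = nn.GRU(input_size=n_channels, hidden_size=hidden,
                          num_layers=2, batch_first=True)
        # Linear read-out produces one continuous intent value per degree of freedom.
        self.readout = nn.Linear(hidden, n_dof)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) of pre-extracted ENG features.
        h, _ = self.rnn(x)
        # Predict the current motor intent from the last time step's hidden state.
        return self.readout(h[:, -1, :])

if __name__ == "__main__":
    decoder = ENGMotorDecoder()
    window = torch.randn(8, 100, 16)   # 8 windows, 100 time steps, 16 channels
    intent = decoder(window)           # -> (8, 15): one value per DOF
    print(intent.shape)
```

In practice such a decoder would be trained on paired ENG recordings and intended hand kinematics; the sketch only shows the input/output shapes implied by a many-channel nerve recording driving a 15-DOF prosthetic hand.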