MPnnet: a Motion Planning Decoding Convolutional Neural Network for EEG-based Brain Computer Interfaces

2021 
Being able to decode the subject's intention to move is still a major challenge in the field of Brain Computer Interfaces (BCI). Moreover, decoding the intention to perform movements from the motor preparation phase is still a largely unexplored topic, as most efforts so far have focused on motor imagery. The present paper deals with BCIs based on electroencephalography (EEG), the best candidate for future systems meant for widespread use, with the goal of decoding the preparation of hand open/close movements from the EEG recordings of the subject. To this end, a dataset of EEG signals recorded in the 1 s frame preceding the onset of movement is extracted from a publicly available database. Epochs are band-pass filtered between 0.5 and 32 Hz and labelled as pre-hand closing (HC), pre-hand opening (HO) or resting (RE) epochs. A system for motion planning decoding, based on a custom Convolutional Neural Network (CNN) and named "Motion Planning Neural Network" (MPnnet), is designed, trained and tested on the constructed dataset, achieving mean HC-RE and HO-RE accuracies of $90.77 \pm 5.56\%$ and $92.48 \pm 4.30\%$, respectively. MPnnet matched the performance of more complex systems proposed in the past, while skipping the inverse problem solution step and demonstrating the ability to self-learn relevant features directly from scalp EEG signals.
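The abstract outlines a pipeline of band-pass filtering (0.5-32 Hz), 1 s pre-movement epoch extraction, and CNN-based classification. The sketch below illustrates such a pipeline under stated assumptions; the sampling rate, channel count and layer configuration are placeholders, not the published MPnnet architecture or the authors' preprocessing code.

```python
# Hypothetical sketch of the pipeline described in the abstract: a 1 s pre-movement
# EEG epoch, band-pass filtered between 0.5 and 32 Hz, classified by a compact CNN
# (e.g. HC vs RE). Sampling rate, channel count and layer sizes are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
import torch
import torch.nn as nn

FS = 250          # assumed sampling rate (Hz)
N_CH = 64         # assumed number of EEG channels
N_SAMP = FS       # 1 s epoch preceding movement onset

def bandpass(epoch, lo=0.5, hi=32.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter, 0.5-32 Hz as stated in the abstract."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, epoch, axis=-1)

class MPnnetLike(nn.Module):
    """Illustrative compact EEG CNN, not the published MPnnet architecture."""
    def __init__(self, n_ch=N_CH, n_samp=N_SAMP, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal conv
            nn.Conv2d(16, 32, kernel_size=(n_ch, 1)),                # spatial conv
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Dropout(0.5),
        )
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, 1, n_ch, n_samp)).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):            # x: (batch, 1, channels, samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Example: classify one filtered epoch as pre-hand-closing (HC) vs resting (RE).
epoch = bandpass(np.random.randn(N_CH, N_SAMP))
x = torch.tensor(np.ascontiguousarray(epoch), dtype=torch.float32)[None, None]
logits = MPnnetLike()(x)
```

The temporal-then-spatial convolution ordering is a common design for EEG decoders and is used here only to make the example concrete; the paper's actual layer choices may differ.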