EEG Classification with Transformer-Based Models

2021 
The Transformer has been widely used in natural language processing (NLP) owing to its superior ability to model long-range dependencies compared with convolutional neural networks (CNNs) and recurrent neural networks (RNNs). Such long-range correlations also matter for the recognition of time-series signals such as the electroencephalogram (EEG). Commonly used EEG classification models include CNNs, RNNs, deep belief networks (DBNs), and hybrid CNNs; the Transformer has not yet been applied to EEG recognition. In this study, we constructed multiple Transformer-based models for motor imagery (MI) EEG classification and obtained superior performance compared with the previous state of the art. Through visualization, we found that activity over the motor cortex contributed strongly to classification in our model, and that a positional embedding (PE) method could improve classification accuracy. These results suggest that the attention mechanism of the Transformer combined with a CNN may be a powerful model for the recognition of sequence data.
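The abstract does not include the models themselves, but the two ingredients it highlights, scaled dot-product self-attention and positional embedding, can be illustrated with a minimal NumPy sketch. All names, dimensions, and the toy input below are hypothetical assumptions for illustration, not the authors' implementation (which also involves CNN feature extraction not shown here).

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sinusoidal_pe(seq_len, d_model):
    # standard sinusoidal positional embedding (one of several PE choices)
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(x):
    # single-head scaled dot-product attention with identity Q/K/V
    # projections, to keep the core mechanism visible
    d = x.shape[-1]
    weights = softmax(x @ x.T / np.sqrt(d))  # each row sums to 1
    return weights @ x

# toy "EEG segment": 128 time steps with 16-dimensional features
x = np.random.randn(128, 16)
y = self_attention(x + sinusoidal_pe(128, 16))
print(y.shape)  # (128, 16)
```

In this sketch the positional embedding is added to the input before attention, which is how the time order of EEG samples can be injected into an otherwise order-agnostic attention mechanism.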