EEG-based Emotion Recognition Using Spatial-Temporal Representation via Bi-GRU

2020 
Many prior studies of EEG-based emotion recognition did not consider the spatial-temporal relationships among brain regions and across time. In this paper, we propose a Regionally-Operated Domain Adversarial Network (RODAN) to learn the spatial-temporal relationships that correlate brain regions and time. Moreover, to enable cross-domain learning, we incorporate an attention mechanism to capture the spatial-temporal relationships among the EEG electrodes, and an adversarial mechanism to reduce the domain shift in EEG signals. To evaluate the performance of RODAN, we conduct subject-dependent, subject-independent, and subject-biased experiments on both the DEAP and SEED-IV data sets, which yield encouraging results. In addition, we discuss the biased-sampling issue often observed in EEG-based emotion recognition and present an unbiased benchmark for both DEAP and SEED-IV.
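The bidirectional GRU named in the title can be illustrated with a minimal NumPy sketch: a sequence of per-time-step EEG features is encoded by one GRU running forward and one running backward, and their final hidden states are concatenated into a single spatial-temporal representation. The layer sizes, initialization, and function names below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def gru_cell(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """One GRU step: x is the input at this time step, h the previous state."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h + bz)                # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)                # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)    # candidate state
    return (1 - z) * h + z * h_tilde                 # interpolate old/new state

def bi_gru(seq, params_fwd, params_bwd, hidden_dim):
    """Encode a sequence with forward and backward GRUs; concat final states."""
    h_f = np.zeros(hidden_dim)
    for x in seq:                     # left-to-right pass
        h_f = gru_cell(x, h_f, *params_fwd)
    h_b = np.zeros(hidden_dim)
    for x in reversed(seq):           # right-to-left pass
        h_b = gru_cell(x, h_b, *params_bwd)
    return np.concatenate([h_f, h_b])  # shape: (2 * hidden_dim,)

def init_params(input_dim, hidden_dim, rng):
    """Random small weights for the three gates (illustrative only)."""
    W = lambda: 0.1 * rng.standard_normal((hidden_dim, input_dim))
    U = lambda: 0.1 * rng.standard_normal((hidden_dim, hidden_dim))
    b = lambda: np.zeros(hidden_dim)
    return (W(), U(), b(), W(), U(), b(), W(), U(), b())

# Toy example: 10 time steps of 4-dimensional EEG features, hidden size 8.
rng = np.random.default_rng(0)
seq = rng.standard_normal((10, 4))
out = bi_gru(seq, init_params(4, 8, rng), init_params(4, 8, rng), 8)
print(out.shape)  # (16,)
```

In the paper, such a representation would feed the attention and domain-adversarial components; here it simply shows how the bidirectional pass summarizes both past and future context at once.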