EEG-based emotion recognition in an immersive virtual reality environment: From local activity to brain network features

2022 
Abstract Emotion electroencephalography (EEG) datasets play a significant role in EEG-based emotion recognition research, providing a platform for comparing different emotion recognition methods. Most existing datasets use 2D images or videos as mood induction procedures (MIPs); however, given the differences in EEG dynamics between 2D and 3D environments, experimental results based on 2D MIPs may transfer poorly to real 3D settings. In this paper, we (1) developed a new emotion EEG dataset, the virtual reality (VR) emotional EEG dataset (VREED), which uses 3D VR videos as MIPs; and (2) presented a baseline for negative/positive emotion classification on the new dataset. The best average accuracy of 73.77% ± 2.01% was obtained using the combination of theta (4–8 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (30–49 Hz) relative power features. Additionally, Spearman correlation analysis and feature selection showed that the occipital and frontal regions played a more critical role than other regions in emotion processing. This new VR emotion EEG dataset will be publicly available, and we encourage other researchers to evaluate their emotion classification methods on VREED.
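The baseline features described above are relative powers in the theta, alpha, beta, and gamma bands. The abstract does not give the authors' exact implementation, but a common way to compute such features is to estimate the power spectral density with Welch's method and normalize each band's power by the total power across all four bands. The sketch below illustrates this under assumed parameters (sampling rate, window length); the function name and defaults are hypothetical, not from the paper.

```python
import numpy as np
from scipy.signal import welch

# Frequency bands as given in the abstract (Hz).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 49)}

def relative_band_power(signal, fs=250):
    """Relative power of each band for one EEG channel.

    Hypothetical sketch: PSD via Welch's method with an assumed 2-second
    window; each band's integrated power is divided by the total power
    over 4-49 Hz, so the four features sum to 1.
    """
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    total_mask = (freqs >= 4) & (freqs <= 49)
    total = np.trapz(psd[total_mask], freqs[total_mask])
    features = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs <= hi)
        features[name] = np.trapz(psd[mask], freqs[mask]) / total
    return features

# Usage on 10 s of synthetic "EEG" (white noise stands in for real data).
rng = np.random.default_rng(0)
x = rng.standard_normal(250 * 10)
feats = relative_band_power(x, fs=250)
```

Because the four bands tile the 4–49 Hz range without gaps, the relative powers sum to one; in practice these per-channel features would be concatenated across channels before classification.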