Learning Data Representation and Emotion Assessment from Physiological Data

2020 
Aiming at a deeper understanding of human emotional states, we explore deep learning techniques for the analysis of physiological data. In this work, raw two-channel pre-frontal electroencephalography and photoplethysmography signals were collected from 25 subjects using EMOTAI's headband while they watched commercials. Taking the raw data as input, convolutional neural networks were used to learn data representations and classify the acquired signals according to the Positive and Negative Affect Schedule. This approach achieved promising results, with average F1-scores of 76.6% for Positive Affect and 83.3% for Negative Affect. Interpretation of the learned data representation was attempted by computing correlation values between features extracted from the raw inputs and the final classification. The features with the most significant correlations were the alpha band power, and the asymmetry and phase synchronization indexes. The extracted features appear to match the ones learned by the neural network, endorsing their validity for emotional studies.
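Two of the features reported as most correlated with the classification, alpha band power and the asymmetry index, can be computed directly from a two-channel pre-frontal recording. The sketch below is illustrative only and is not the paper's pipeline: the 256 Hz sampling rate, the 8–13 Hz alpha band limits, and the synthetic signals are assumptions, and the log-ratio form of the asymmetry index is one common convention among several.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band=(8.0, 13.0)):
    # Welch power spectral density, integrated over the alpha band (8-13 Hz assumed)
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def asymmetry_index(left, right, fs):
    # Log-ratio asymmetry of alpha power between the two channels
    # (one common convention; the paper's exact definition may differ)
    return np.log(band_power(right, fs)) - np.log(band_power(left, fs))

# Synthetic two-channel pre-frontal EEG: 4 s at an assumed 256 Hz,
# with a stronger 10 Hz alpha component on the left channel
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
left = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

print(asymmetry_index(left, right, fs))  # negative here: weaker alpha on the right channel
```

Features like these would then be correlated with the network's outputs to probe what the convolutional layers have learned from the raw signal.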