A Media-Guided Attentive Graphical Network for Personality Recognition Using Physiology

2021 
Physiological automatic personality recognition has developed rapidly as a way to model an individual's personality traits from a variety of signals. However, few studies have tackled the problem of integrating multiple observations into a single personality prediction. In this study, we focus on a novel learning architecture that models personality traits under a Many-to-One scenario. We propose to integrate not only information about the user but also the effect of the affective multimedia stimulus. Specifically, we present a novel Acoustic-Visual Guided Attentive Graph Convolutional Network for enhanced personality recognition. The emotional multimedia content guides the organization of the physiological responses into a graph-like structure that captures the latent inter-correlations among all responses to the affective multimedia. These graphs are then processed by a Graph Convolutional Network (GCN) to jointly model the subject's responses at both the instance and inter-correlation levels. We show that our model outperforms the current state of the art on two large public corpora for personality recognition. Further analysis reveals that there indeed exists a multimedia preference for inferring personality from physiology, and that several frequency-domain descriptors of ECG and the tonic component of EDA are robust features for automatic personality recognition.
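To make the Many-to-One pipeline described in the abstract concrete, the sketch below shows one plausible (not the authors') realization: a subject's per-clip physiological descriptors are arranged as nodes of a graph whose edges are derived from acoustic-visual stimulus features, and a single GCN layer with mean pooling produces one trait score. The cosine-similarity adjacency, tensor shapes, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of a media-guided graph + GCN for Many-to-One trait prediction.
# All function names, shapes, and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def media_guided_adjacency(media_feats: torch.Tensor) -> torch.Tensor:
    """Build a symmetric, self-looped, degree-normalized adjacency from
    acoustic-visual stimulus features (assumed shape: [num_clips, feat_dim])."""
    sim = F.cosine_similarity(media_feats.unsqueeze(1), media_feats.unsqueeze(0), dim=-1)
    adj = sim.clamp(min=0.0) + torch.eye(media_feats.size(0))            # keep positive ties, add self-loops
    deg_inv_sqrt = adj.sum(dim=-1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(-1) * adj * deg_inv_sqrt.unsqueeze(0)  # D^-1/2 (A + I) D^-1/2


class GCNTraitRegressor(nn.Module):
    """One graph-convolution layer over per-clip physiological descriptors,
    pooled into a single personality-trait prediction (Many-to-One)."""
    def __init__(self, in_dim: int, hidden_dim: int = 32):
        super().__init__()
        self.gc = nn.Linear(in_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, physio_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = F.relu(adj @ self.gc(physio_feats))   # propagate responses along media-guided edges
        return self.out(h.mean(dim=0))            # pool all responses into one trait score


# Toy usage: 8 affective clips, 16-dim media features, 24-dim physiological (e.g. ECG/EDA) features.
media = torch.randn(8, 16)
physio = torch.randn(8, 24)
model = GCNTraitRegressor(in_dim=24)
trait_score = model(physio, media_guided_adjacency(media))
```

The design choice worth noting is that the adjacency is computed from the stimulus (media) features rather than from the physiological signals themselves, which is one way to let the multimedia content "guide" how responses are related before graph convolution.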