Computational Analysis of Gaze Behavior in Autism During Interaction with Virtual Agents

2019 
Individuals with autism spectrum disorder (ASD) are known to have significantly impaired social interaction and communication abilities. These impairments are characterized by difficulties in using and perceiving non-verbal cues, such as facial expressions. The difficulty in processing communicators' facial expressions is often attributed to the atypical gaze patterns of individuals with ASD. We present a computational study of gaze behavior in adolescents with ASD during their interaction with virtual agents (avatars) in a virtual reality-based social communication platform. We study the effects on the subjects' pupil response (pupil diameter changes) and looking pattern (fixation coordinates and duration) when they are exposed to avatars demonstrating context-relevant emotional expressions. Fixation and pupil-response data are collected using a commercial eye tracker from subjects with and without ASD during their interactions with the avatars. These data are analyzed to investigate how the pupil response dynamics and fixation patterns of the ASD group differ from those of their typically developing peers. Our results indicate that communicators' facial expressions can significantly affect the gaze behavior of the ASD subjects. We also observe reduced complexity in the pupil response dynamics and lower synchrony between pupil response and fixation pattern in the ASD group.
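The abstract reports reduced complexity of the pupil response dynamics and lower pupil-fixation synchrony in the ASD group, but does not state which metrics were used. The sketch below illustrates one plausible way to quantify these two properties, assuming sample entropy as the complexity measure and peak normalized cross-correlation as the synchrony measure; the function names, parameter choices (m, r, max_lag), and synthetic signals are illustrative assumptions, not the paper's actual method.

```python
import numpy as np


def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal; lower values indicate more regular
    (less complex) dynamics. Used here as an assumed complexity measure."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # conventional tolerance: 20% of signal std

    def _count(k):
        # Templates of length k drawn from the first n - m points so that
        # counts for lengths m and m + 1 are directly comparable.
        templates = np.array([x[i:i + k] for i in range(n - m)])
        total = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every template
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.sum(dist <= r) - 1  # exclude the self-match
        return total

    b, a = _count(m), _count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")


def peak_cross_correlation(a, b, max_lag=50):
    """Peak absolute normalized cross-correlation over a small lag window,
    used here as an assumed proxy for pupil-fixation synchrony."""
    a = (a - np.mean(a)) / (np.std(a) + 1e-12)
    b = (b - np.mean(b)) / (np.std(b) + 1e-12)
    n = len(a)
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            c = np.dot(a[lag:], b[:n - lag]) / (n - lag)
        else:
            c = np.dot(a[:n + lag], b[-lag:]) / (n + lag)
        best = max(best, abs(c))
    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 10 s of pupil diameter sampled at 60 Hz (in mm).
    pupil = 3.0 + np.cumsum(rng.normal(0.0, 0.01, 600))
    # Hypothetical per-sample fixation-duration signal (in ms), loosely
    # coupled to the pupil trace with additive noise.
    fixation = 200 + 50 * (pupil - pupil.mean()) + rng.normal(0.0, 5.0, 600)

    print("pupil sample entropy:", sample_entropy(pupil))
    print("pupil-fixation synchrony:", peak_cross_correlation(pupil, fixation))
```

Under this framing, a lower sample-entropy value for the ASD group would correspond to the reported "reduced complexity", and a lower peak cross-correlation to the reported "lower synchrony"; other measures (e.g., approximate entropy or phase-based synchrony indices) would fit the same description.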