Articulatory-Acoustic Analyses of Mandarin Words in Emotional Context Speech for Smart Campus

2018 
In recent years, along with the promotion of the smart campus, social networks have developed rapidly, demanding high accuracy from human-human and human-machine interaction technologies. Physiological information has therefore become an important complement to, or even a replacement for, acoustic features in speech interaction processing. With the aim of assessing the influence of emotion on articulatory-acoustic features in speech production, the current study explored the articulatory mechanisms underlying the emotional speech production of Mandarin words. We first used the AG501 electromagnetic articulography (EMA) device to collect articulatory and acoustic data synchronously while subjects spoke specific Mandarin words with different emotions (anger, sadness, happiness, and neutral); articulatory and acoustic features were then extracted from the collected data and analyzed with a one-way ANOVA to determine the significance of emotion on these features. The results show that the motion of the articulators (tongue and lips) was significantly influenced by emotion: the motion ranges of the tongue and lips were larger for anger than for the other emotions, while tongue and lip speeds were more sensitive for anger and happiness than for sadness and neutral in emotional words. The results are discussed to reveal the relationship between the acoustic and articulatory features of emotional speech, leading to the conclusion that articulatory motion features (tongue and lips) may be a major cue for emotional speech recognition and could be applied to human-machine interaction research for the smart campus in the future.
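The statistical test named in the abstract is a one-way ANOVA with emotion as the single factor and an articulatory feature (e.g., tongue motion range) as the dependent variable. The sketch below is only an illustration of that kind of test, not the authors' analysis code; the feature name, grouping, and numeric values are hypothetical placeholders assumed for the example.

```python
# Minimal sketch of a one-way ANOVA over an articulatory feature
# (e.g., tongue motion range in mm) grouped by emotion.
# All values below are hypothetical placeholders, not the study's measurements.
from scipy.stats import f_oneway

# Hypothetical per-utterance tongue motion ranges (mm) for each emotion.
tongue_range = {
    "anger":     [14.2, 15.1, 13.8, 14.9],
    "happiness": [12.0, 12.7, 11.9, 12.4],
    "sadness":   [10.1,  9.8, 10.5, 10.0],
    "neutral":   [10.4, 10.9, 10.2, 10.6],
}

# One-way ANOVA: does emotion have a significant effect on this feature?
f_stat, p_value = f_oneway(*tongue_range.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant effect of emotion
```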