Perceptually motivated automatic dance motion generation for music

2009 
In this paper, we describe a novel method for automatically generating synchronized dance motion that is perceptually matched to a given musical piece. The proposed method extracts 30 musical features from musical data and 37 motion features from motion data. A matching process is then performed between the two feature spaces, considering both the correspondence of relative changes within each feature space and the correlations between musical and motion features. Similarity matrices are introduced to match the amount of relative change in each feature space, and correlation coefficients measure the strength of association between each pair of musical and motion features. In this way, the progressions of musical and dance motion patterns, as well as the perceptual changes between consecutive musical and motion segments, are matched. To demonstrate the effectiveness of the proposed approach, we designed and carried out a user opinion study to assess its perceived quality. Statistical analysis of the user study results showed that the proposed approach generated results significantly better than those produced by a random walk through the dance motion database. Copyright © 2009 John Wiley & Sons, Ltd.
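
To make the matching machinery concrete, the following is a minimal Python sketch of the two ingredients the abstract names: self-similarity matrices that capture relative changes within each feature space, and pairwise correlation coefficients between musical and motion features. The feature dimensions (30 musical, 37 motion) come from the abstract; everything else (Euclidean-distance similarity, Pearson correlation, the function names, and the random toy data) is an illustrative assumption, not the authors' implementation.

import numpy as np

def self_similarity(features):
    # Pairwise self-similarity matrix of a (frames x dims) feature sequence.
    # Euclidean distance mapped into (0, 1] is an assumption; the paper may
    # use a different similarity measure.
    diff = features[:, None, :] - features[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    return 1.0 / (1.0 + dist)

def matrix_agreement(a, b):
    # Score how well relative changes in the two feature spaces correspond
    # by correlating the flattened similarity matrices.
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def feature_correlations(music, motion):
    # Correlation coefficient for each (musical feature, motion feature) pair.
    n_music = music.shape[1]
    full = np.corrcoef(music, motion, rowvar=False)
    return full[:n_music, n_music:]  # rows: musical features, cols: motion features

# Toy data standing in for real feature sequences of 30 musical and
# 37 motion features per frame.
rng = np.random.default_rng(0)
frames = 64
music = rng.standard_normal((frames, 30))
motion = rng.standard_normal((frames, 37))

score = matrix_agreement(self_similarity(music), self_similarity(motion))
corrs = feature_correlations(music, motion)
print(f"similarity-matrix agreement: {score:.3f}")
print(f"pairwise feature correlations: {corrs.shape}")  # (30, 37)

In a matching pipeline of this kind, candidate motion segments would be scored against a musical segment using both quantities, so that a segment is preferred when its internal pattern of change mirrors the music's and its features co-vary with the musical ones.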