Collaborating with an Autonomous Agent to Generate Affective Music

2016 
Recent multidisciplinary research has investigated ways to offer new music-making experiences to musically untrained users. Our approach distributes the process of music making between the user and an autonomous agent by encoding this collaboration in the emotional domain. In this framework, users communicate the emotions they wish to express to Robin, the autonomous agent, which interprets this information to generate music with a matching affective character. Robin is taught a set of basic compositional rules of tonal music, which it uses to create original compositions in a Western classical-like style. Associations between changes in musical factors and changes in the communicated emotions are operationalized on the basis of recent findings from research in the psychology of music. At each new bar, a number of stochastic processes determine the values of seven musical factors whose combination best matches the intended emotion. Robin's ability to validly communicate emotions was tested in an experimental study (N = 33); results indicated that listeners correctly identified the intended emotions. Robin was also employed in two interactive artworks, discussed in the article, which demonstrate the algorithm's potential for use in interactive installations.
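
The per-bar mechanism described above can be illustrated with a short sketch. The Python code below is an assumption-laden illustration, not the paper's implementation: the abstract does not enumerate the seven musical factors, so the factor names (tempo, mode, loudness, register, articulation, harmony, rhythm), their value ranges, and the sampling weights are hypothetical, chosen to echo common findings in music-emotion research. Robin's actual rules and factor mappings are specified in the full article.

```python
import random

# Hypothetical emotion-conditioned distributions over seven musical
# factors. The factor names, ranges, and weights are illustrative
# assumptions, not the values used by Robin.
FACTOR_DISTRIBUTIONS = {
    "happy": {
        "tempo_bpm":    (110, 160),                          # faster tempi
        "mode":         [("major", 0.9), ("minor", 0.1)],
        "loudness":     [("f", 0.6), ("mf", 0.4)],
        "register":     [("high", 0.7), ("mid", 0.3)],
        "articulation": [("staccato", 0.7), ("legato", 0.3)],
        "harmony":      [("consonant", 0.8), ("dissonant", 0.2)],
        "rhythm":       [("regular", 0.8), ("syncopated", 0.2)],
    },
    "sad": {
        "tempo_bpm":    (50, 80),                            # slower tempi
        "mode":         [("major", 0.1), ("minor", 0.9)],
        "loudness":     [("p", 0.7), ("mp", 0.3)],
        "register":     [("low", 0.6), ("mid", 0.4)],
        "articulation": [("legato", 0.9), ("staccato", 0.1)],
        "harmony":      [("consonant", 0.6), ("dissonant", 0.4)],
        "rhythm":       [("regular", 0.9), ("syncopated", 0.1)],
    },
}

def sample_factor(spec):
    """Sample one factor value: a (low, high) tuple is treated as a
    uniform numeric range; a list of (value, weight) pairs as a
    weighted categorical choice."""
    if isinstance(spec, tuple):
        low, high = spec
        return round(random.uniform(low, high))
    values, weights = zip(*spec)
    return random.choices(values, weights=weights, k=1)[0]

def generate_bar(emotion):
    """Draw a fresh combination of factor values for one bar,
    conditioned on the target emotion."""
    return {name: sample_factor(spec)
            for name, spec in FACTOR_DISTRIBUTIONS[emotion].items()}

if __name__ == "__main__":
    # Each bar gets its own stochastic draw, so consecutive bars vary
    # while staying within the target emotion's overall profile.
    for bar_number in range(4):
        print(bar_number, generate_bar("sad"))
```

Because every bar is sampled independently from the emotion's distributions, the output varies from bar to bar while its overall statistics stay aligned with the intended emotion, which is the behavior the abstract attributes to the per-bar stochastic processes.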