Anticipation in collaborative music performance using fuzzy systems: a case study
Thörn, Oscar, Fögel, Peter, Knudsen, Peter, de Miranda, Luis, Saffiotti, Alessandro
The creation and performance of music has inspired AI researchers since the very early days of artificial intelligence [8, 13, 10], and there is today a rich literature of computational approaches to music [11], including AI systems for music composition [3] and improvisation [2]. As pointed out by Thom [15], however, these systems rarely focus on the spontaneous interaction between the human and the artificial musicians. We claim that such interaction demands a combination of reactivity and anticipation, that is, the ability to act now based on a predictive model of the companion player [12]. This paper reports our initial steps in the generation of collaborative human-machine music performance, as a special case of the more general problem of anticipation and creative processes in mixed human-robot, or anthrobotic, systems [4]. We consider a simple case study of a duo consisting of a human pianist accompanied by an off-the-shelf virtual drummer, and we design an AI system to control the key performance parameters of the virtual drummer (patterns, intensity, complexity, fills, and so on) as a function of what the human pianist is playing. The AI system is knowledge-based: it relies on an internal model represented by means of fuzzy logic.
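The abstract only sketches the architecture, so the snippet below is an illustrative example rather than the authors' actual rule base: it shows how a small fuzzy rule system could map features of the pianist's playing to one of the drummer's performance parameters, assuming the scikit-fuzzy Python library. The feature names (note_density, loudness), the output parameter (drum_intensity), the membership functions, and the rules are all hypothetical choices for illustration.

    import numpy as np
    import skfuzzy as fuzz
    from skfuzzy import control as ctrl

    # Universe of discourse: all quantities normalised to a 0-10 scale (assumption).
    u = np.arange(0, 11, 1)

    # Inputs: hypothetical features extracted from the pianist's playing.
    density = ctrl.Antecedent(u, 'note_density')
    loudness = ctrl.Antecedent(u, 'loudness')

    # Output: one performance parameter of the virtual drummer.
    intensity = ctrl.Consequent(u, 'drum_intensity')

    # Triangular membership functions for the linguistic terms of each variable.
    for var in (density, loudness, intensity):
        var['low'] = fuzz.trimf(u, [0, 0, 5])
        var['medium'] = fuzz.trimf(u, [0, 5, 10])
        var['high'] = fuzz.trimf(u, [5, 10, 10])

    # Fuzzy rules linking the pianist's playing to the drummer's intensity.
    rules = [
        ctrl.Rule(density['high'] & loudness['high'], intensity['high']),
        ctrl.Rule(density['medium'] | loudness['medium'], intensity['medium']),
        ctrl.Rule(density['low'] & loudness['low'], intensity['low']),
    ]

    # Build the controller and evaluate it on one frame of extracted features.
    sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
    sim.input['note_density'] = 7.5
    sim.input['loudness'] = 8.0
    sim.compute()
    print(sim.output['drum_intensity'])  # defuzzified value sent to the drummer

In such a setup the rule base is the knowledge-based internal model: it can be inspected and tuned by a musician, and the defuzzified output would be recomputed continuously as new features are estimated from the pianist's playing.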
arXiv.org Artificial Intelligence
Jun-5-2019