Improvised Musical Interactions - OMax & Co
This project focuses on the development of improvised man-machine musical interactions. A new interaction paradigm, invented at IRCAM, has been made available to the general public through the OMax software program.
Using machine learning techniques and formal languages, OMax learns in an unsupervised way from either a MIDI or an audio stream produced by a musician. The process underlying this interaction could be called "stylistic reinjection". An improvising musician is continually fed complex feedback from several sources: he hears himself play, and he listens to the others while memorizing sound images that flow from the present towards the past. Held in medium- and long-term memory, these motifs, combined with still older images drawn, for example, from the repertoire or from musical culture, can return to the present after undergoing several transformations, including one of the most common in improvisation: formal recombination. OMax models this memory-based process and makes it possible to "reify" it, to make it heard on stage: it re-injects musical figures drawn from its short- and long-term memory, reconstructed in a manner that is both similar and innovative, providing the musician with material that is at once familiar and stimulating.
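For readers curious about the mechanics: the published OMax work builds this sequence memory on a factor oracle, an automaton constructed online over the incoming symbol stream whose suffix links connect positions that share a common context. The sketch below is a minimal illustration of that idea, not OMax's actual implementation; it assumes the stream has already been segmented into discrete symbols (stand-ins for the pitch or spectral units OMax extracts), and all names are ours. Learning is the online oracle construction; reinjection is a walk that usually continues the memorized phrase but occasionally jumps along a suffix link, recombining passages that share a context.

```python
import random

class FactorOracle:
    """Simplified factor oracle learned online from a symbol stream.
    Suffix links connect states whose contexts end the same way,
    which is what makes stylistic recombination possible."""

    def __init__(self):
        self.transitions = [{}]   # transitions[i]: symbol -> target state
        self.suffix = [-1]        # suffix link of each state
        self.symbols = [None]     # symbol read on entering each state

    def add(self, sym):
        """One online construction step (Allauzen et al. 1999)."""
        new = len(self.transitions)
        self.transitions.append({})
        self.symbols.append(sym)
        self.transitions[new - 1][sym] = new
        k = self.suffix[new - 1]
        while k > -1 and sym not in self.transitions[k]:
            self.transitions[k][sym] = new
            k = self.suffix[k]
        self.suffix.append(0 if k == -1 else self.transitions[k][sym])

def reinject(oracle, length, continuity=0.8, seed=None):
    """Generate `length` symbols by navigating the oracle: with
    probability `continuity` keep playing the memorized sequence,
    otherwise jump along a suffix link to an alternative continuation
    of the same context (formal recombination)."""
    rng = random.Random(seed)
    last = len(oracle.symbols) - 1
    state = 1
    out = [oracle.symbols[state]]
    while len(out) < length:
        if state < last and rng.random() < continuity:
            state += 1                          # continue the phrase
        elif oracle.suffix[state] > 0:
            state = oracle.suffix[state] + 1    # same context, new continuation
        else:
            state = rng.randrange(1, last + 1)  # restart anywhere in memory
        out.append(oracle.symbols[state])
    return out

# Toy input: a symbolic stand-in for a short improvised phrase.
fo = FactorOracle()
for sym in "abracadabra":
    fo.add(sym)
print("".join(reinject(fo, 20, seed=4)))
```

The output re-uses the musician's own material but splices it at points of shared context, which is why it sounds both similar and innovative rather than random.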
This project has led to two new research projects: SoMax, which explores the immediate reactivity of an artificial agent to its sound environment, and Improtek (in collaboration with the EHESS), which explores guided improvisation within the framework of a specific scenario (e.g. a chord chart). The aim of the ANR-funded project DyCI2 is to combine these two approaches with innovative artificial listening techniques.
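To make the "scenario" idea concrete, here is a toy sketch, not Improtek's actual algorithm: the memory is a list of (chord, note) events captured from a musician, the scenario is a chord chart, and generation at each beat either continues the phrase it is in (when that phrase's next chord matches the chart) or jumps to any memory event carrying the required chord. All data and names here are hypothetical.

```python
import random

def guided_improvisation(memory, scenario, seed=None):
    """Recombine `memory` (a list of (chord, note) events) so that the
    output follows `scenario` (a chord chart), preferring continuity
    within the original phrase whenever the chart allows it."""
    rng = random.Random(seed)
    pos, out = None, []
    for chord in scenario:
        if pos is not None and pos + 1 < len(memory) and memory[pos + 1][0] == chord:
            pos += 1                           # keep the phrase's continuity
        else:
            candidates = [i for i, (c, _) in enumerate(memory) if c == chord]
            if not candidates:
                out.append(None)               # no material for this chord
                pos = None
                continue
            pos = rng.choice(candidates)       # recombine under the constraint
        out.append(memory[pos][1])
    return out

# Hypothetical memory: a phrase recorded over a ii-V-I in C major.
memory = [("Dm7", "F4"), ("Dm7", "A4"), ("G7", "B4"), ("G7", "D5"),
          ("Cmaj7", "E5"), ("Cmaj7", "G4")]
print(guided_improvisation(memory, ["Dm7", "G7", "Cmaj7", "Cmaj7"], seed=2))
```

The chart acts as a constraint on recombination: the agent still reinjects learned material, but only material compatible with the current step of the scenario.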
IRCAM Team: Music Representations.