Improvisation is a major driver of communication and action in human interaction. In its highest form, musical improvisation combines structured, planned actions with unpredictable, localized decisions and variations, optimizing adaptation to the context, expressing the subject's creativity, and stimulating coordination and cooperation between agents. Implementing powerful and realistic human-machine environments for improvisation requires going beyond the software engineering of creative agents that can listen to and generate audio signals. This project proposes to fundamentally renew the paradigm of improvised human-machine interaction by establishing a true continuum from co-creative musical logics to a form of "physical interreality" (a mixed-reality scheme in which the physical world itself is actively modified) anchored in acoustic instruments.
Musician Camel Zekri with his augmented instrument - photo: Jeff Joly @ popmyfilm
The main objective of this project is to create the scientific and technological conditions for the emergence of mixed-reality musical systems enabling improvised human-machine interactions based on the interrelation of creative digital agents and the active acoustic control of musical instruments. We call these mixed-reality devices creative instruments. Functionally integrating creative artificial intelligence and active acoustic control into the organological core of the musical instrument, so as to foster plausible situations of physical interreality, requires highly interdisciplinary public and private research working in synergy, such as that provided by the partners. This evolution is likely to disrupt artistic and social practices and, in the long run, to have a powerful impact on the music industry as well as on amateur and professional musical practice.
IRCAM team: Musical Representations