(DYCI2) Creative Agents, Improvised Interactions, and “Meta-Composition”
The team explores the paradigm of computational creativity through devices inspired by artificial intelligence, in the context of new symbolic musician-machine interactions and of data science and knowledge extraction.
In the tradition of the OMax software, research on learning and interactive music generation has led to several paradigms of musician-machine interaction, characterized by an architecture that combines artificial listening to the audio signal, discovery of a symbolic vocabulary, statistical learning of a sequence model, and generation of new musical sequences through reactive and/or planning (scenario-based) mechanisms.
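As a rough illustration of the learn-then-generate stage of this architecture: the actual systems build a factor oracle over the symbolic sequence, but a first-order Markov model (a simplified, hypothetical stand-in, with all names invented here) conveys the same idea of learning continuations from a listened corpus and navigating them to produce new sequences.

```python
import random

def learn_sequence_model(symbols):
    """Statistical learning step: collect the observed continuations of each symbol."""
    model = {}
    for cur, nxt in zip(symbols, symbols[1:]):
        model.setdefault(cur, []).append(nxt)
    return model

def generate(model, start, length, rng=None):
    """Generation step: navigate the learned model to produce a new sequence."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length - 1):
        continuations = model.get(out[-1])
        if not continuations:          # dead end: restart from any learned state
            continuations = list(model)
        out.append(rng.choice(continuations))
    return out

corpus = list("abacabadabacaba")       # symbolic vocabulary from the listening stage
model = learn_sequence_model(corpus)
print("".join(generate(model, "a", 12)))
```

The generated sequence recombines the corpus in a stylistically consistent way while never merely replaying it, which is the core of corpus-based generation.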
The ANR project DYCI2 (2015-2018) gave birth to DYCI2lib, a library containing a collection of generative agents and tools for smart composition and human-machine co-improvisation. These agents combine machine learning models and generative processes with reactive listening modules. The library offers a collection of “agents/instruments” embedding free, planned, and reactive approaches to corpus-based generation, as well as models of short-term dynamic scenarios (“meta-DJing”).
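The “planned” approach can be sketched as scenario-guided generation: events from an annotated corpus are chosen to match a target label sequence, preferring steps that continue a contiguous run of the corpus. This is a minimal illustration in the spirit of a scenario mechanism, not DYCI2lib's actual API; the function and variable names are invented here.

```python
def scenario_generate(corpus_labels, scenario):
    """Return corpus indices realizing `scenario`, favoring contiguous runs."""
    path, prev = [], None
    for label in scenario:
        candidates = [i for i, l in enumerate(corpus_labels) if l == label]
        if not candidates:
            path.append(None)          # no matching event for this scenario step
            prev = None
            continue
        # prefer the event immediately following the previously chosen one,
        # so the output reuses coherent stretches of the corpus when possible
        if prev is not None and prev + 1 in candidates:
            choice = prev + 1
        else:
            choice = candidates[0]
        path.append(choice)
        prev = choice
    return path

corpus = ["C", "F", "G", "C", "Am", "F", "G", "C"]      # e.g. chord labels
print(scenario_generate(corpus, ["C", "F", "G", "C"]))  # → [0, 1, 2, 3]
```

A reactive agent would instead revise such a path on the fly as the listening modules update the scenario.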
This research axis emphasizes improvised interaction as an anthropological and cognitive model of action and decision, as a scheme of discovery and unsupervised learning, and as a discursive tool for exchange between humans and digital artifacts, from the perspective of style and interaction modeling.
The objective is to constitute autonomous creative agents through direct learning resulting from exposure to the live performance of improvising human musicians, by creating a loop of stylistic feedback through the simultaneous exposure of humans to the improvised productions of the digital artifacts themselves, thus starting from a situation of humanartefact communication evolving in a complex dynamic of co-creativity. These agents/instruments are also a part of a study on new generative paradigms .informed by AI., working, for example, on the association of generative mechanisms with modules for the extraction and inference of a harmonic grill in real-time.
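One common building block for real-time harmonic grid inference is labeling each analysis frame by matching its 12-bin chroma vector against chord templates. The sketch below is an illustrative, generic version of that step (not the project's actual method), using binary major/minor triad templates:

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def chord_templates():
    """Binary chroma templates (pitch-class sets) for the 24 major/minor triads."""
    templates = {}
    for root in range(12):
        templates[NOTES[root]] = {root, (root + 4) % 12, (root + 7) % 12}        # major
        templates[NOTES[root] + "m"] = {root, (root + 3) % 12, (root + 7) % 12}  # minor
    return templates

def infer_chord(chroma):
    """Pick the triad whose template gathers the most chroma energy."""
    best, best_score = None, float("-inf")
    for name, pcs in chord_templates().items():
        score = sum(chroma[p] for p in pcs)
        if score > best_score:
            best, best_score = name, score
    return best

# A frame with energy on C, E, and G is labeled as a C major triad.
frame = [1.0, 0, 0, 0, 0.9, 0, 0, 0.8, 0, 0, 0, 0]
print(infer_chord(frame))  # → C
```

Running this per frame, with smoothing over time, yields a chord sequence that a generative mechanism can follow as its harmonic scenario.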
After serving as the foundation of large-scale productions validated by expert musicians (Pascal Dusapin, Bernard Lubat, Steve Lehman, Rémi Fox, Hervé Sellin, etc.) as well as workshops and festivals (the Improtech Paris-Athina and Paris-Philly festivals; MASS MoCA; Cycling ’74; etc.), the DYCI2 themes continue with the launch of the ANR project MERCI (Mixed Musical Reality with Creative Instruments), coordinated by G. Assayag with the EHESS and the startup HyVibe, and with the support of the ERC Advanced Grant REACH, also led by G. Assayag. Setting up powerful and realistic human-machine environments for improvisation requires going beyond the software engineering of creative agents with listening and audio-signal generation capabilities. This line of research proposes to radically alter the paradigm of improvised human-computer interaction by establishing a continuum from co-creative musical logic to a form of “physical interreality” (a mixed-reality scheme in which the physical world is actively modified) anchored in acoustic instruments.
IRCAM Team: Musical Representations