
REACH 2/4: The Other Voice of AI

The Artistic Residency Blog

In the age of generative artificial intelligence (ChatGPT, Midjourney, and the like), the REACH project led by Gérard Assayag's team at IRCAM might seem like just one of many. In truth, it differs from them in many respects: its time frame, its mode(s) of operation, and even its intended goal.

Although it only began in 2021, REACH capitalizes on all of the Musical Representations team's previous projects involving "symbolic interaction", initiated in the early 2000s with the first human-machine improvisation software, the famous OMax, and takes them to a new dimension.

From left to right: Gérard Assayag, Marco Fiorini, Horse Lords and Mikhaïl Malt at the Reaching Out! concert as part of ManiFeste 2023 © Hervé Véronèse

OMax can be regarded as an "AI" before its time: "The principle of AI is to simulate the higher functions of the human mind, and OMax 'simulates' an improviser," notes Gérard Assayag. Moreover, although it does not rely on deep learning or the other algorithms developed by the start-ups the media have been talking about incessantly of late, OMax foreshadowed them, twenty years ahead of its time, since its principle was already to learn the stylistic signature of a given body of music.

"The idea is as follows," explains Gérard Assayag, "musical style is like a territory. In classical music, for example, the different tonalities are like different landscapes, the rhythm like uneven terrain, and so on. When a musician plays, or improvises, he takes a path through this landscape and, in so doing, reveals a part of it, albeit necessarily a limited one. Based on the paths taken by the musician, OMax will attempt to map the potential entirety of the territory traversed. This is how OMax works: from one instance among thousands, it infers the style, or rather the overall structure that gives the idea of style. Then it plays. In other words, it takes its turn walking around the map it has drawn. Which explains why what it plays resembles what it has been fed, forming as it does new, coherent variations."

Mikhaïl Malt at the Reaching Out! concert as part of ManiFeste 2023 © Hervé Véronèse

As for the intended purpose, what is expected of software developed within the REACH framework is not to generate content indifferently or according to a more or less precise request made by the user (known as a "prompt"). "That's not what we're about," insists Gérard Assayag. "We're not interested in a machine that is creative on its own... whatever that means! But rather in its interactions with musicians. The notion of co-creativity must therefore be seen as a phenomenon of 'emergence' in the complex system formed by the musician and the machine, a system in which each listens to and constantly learns from the other and develops an adaptive path. In this way, new musical forms emerge, produced jointly in a double feedback loop, which can be reduced neither to the action of the one nor to that of the other, but exist only in the fleeting being of the interaction. A whole new universe is thus opened up to the imagination, and the results we arrive at are undoubtedly 'creations'."
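As a purely schematic illustration of that double feedback loop (hypothetical code, not REACH software, with invented names), the sketch below has two agents that each keep a small model of what they have heard, update it with every phrase the other plays, and answer out of that evolving memory, so that the resulting exchange belongs to neither alone.

```python
import random
from collections import defaultdict

class ListeningAgent:
    """A toy agent that learns from what it hears and answers from that memory."""

    def __init__(self, seed_phrase):
        self.memory = defaultdict(list)   # note -> observed continuations
        self.last_note = seed_phrase[0]
        self.listen(seed_phrase)

    def listen(self, phrase):
        """Update the internal model with the partner's phrase."""
        for a, b in zip(phrase, phrase[1:]):
            self.memory[a].append(b)

    def respond(self, length=4):
        """Answer with a short phrase walked out of the current memory."""
        note = self.last_note
        phrase = []
        for _ in range(length):
            options = self.memory.get(note) or list(self.memory)
            note = random.choice(options)
            phrase.append(note)
        self.last_note = note
        return phrase

# Two agents trade phrases: each response feeds the other's model.
human = ListeningAgent([60, 62, 64, 65, 67])
machine = ListeningAgent([67, 65, 64, 62, 60])
for turn in range(4):
    phrase = human.respond()
    machine.listen(phrase)
    print("human:  ", phrase)
    phrase = machine.respond()
    human.listen(phrase)
    print("machine:", phrase)
```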

This research seems to call for us to move away from fixed postures and return to a reflection on the very term "creativity" as it is used in the debate on artificial intelligence. A reflection that is at least as much political as aesthetic...

In an upcoming episode, we'll meet Mikhaïl Malt...