After studying with Pierre Jodlowski (IRCAM Cursus 2021-2022), Jean-Luc Hervé and Roque Rivas (Pôle Supérieur Paris-Boulogne-Billancourt), and Martin Matalon (CRR93), composer Basile Chassaing has spent the last five years working on how to incorporate gesture into his compositions. In fact, it's this very link to gesture that led him to the world of contemporary music writing.
Basile Chassaing in 2021 at IRCAM
His original training was in jazz improvisation. Then, in 2009, he discovered Toucher by Vincent-Raphaël Carinola, a piece for thereminvox, computer, and 6-channel sound diffusion system. Invented in the 1920s by the Russian engineer Leon Theremin, the theremin is a unique instrument consisting of two antennas that sense the movement of nearby bodies through capacitance: each hand alters the electromagnetic field around an antenna, producing and modifying the pitch and intensity of an electric sound with a timbre that is instantly recognizable, especially to science-fiction film fans.
"What's interesting about Toucher is that the composer uses the instrument primarily as a gesture sensor to trigger and manipulate sounds. The gestural discourse and the musical intent are intimately linked and form part of the same dramaturgy. Toucher helped reconcile me with electroacoustic music in the broadest sense by reintegrating the corporeality of the musical gesture," recalls Chassaing. "Ten years later, in 2018, I composed my first piece for motion sensors, laps. Since then, the subject has continued to challenge me."
laps performed by Yi-hsuan Chen
This is how he came up with the idea for his artistic research residency at IRCAM: combining motion capture with real-time audio synthesis processes. Or, to put it in more concrete terms, how to turn a dancer's body into an instrument for interpreting an electronic score.
"The principle is to compose the space in the manner of a pseudo-instrumental ecosystem. In the same way that a percussionist moves from one instrument to another in his set to play them during a concert, the dancer can play various virtual instruments."
"The space becomes an instrument: precise gestures trigger or modulate electronic sounds."
"I call it an ecosystem because I want to create a back-and-forth between the dancer's gestures and the sound production system, with the virtual instruments evolving over time. I imagine this environment to be flexible, semi-open and evolving."
The basic setup combines capturing gestures and mapping those gestures to various synthesis or sound modulation processes. These mappings can be either explicit (a given gesture triggers a given musical process) or implicit, but above all they can evolve over time to build a rich musical narrative.
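The explicit/implicit distinction, and the idea that the active mapping can be swapped over time, can be sketched in a few lines. This is a minimal illustration under assumed names (Gesture, Ecosystem, the feature and parameter names are all hypothetical), not the composer's actual patch:

```python
# Sketch of gesture-to-synthesis mapping: gesture features in, named
# synthesis parameters out. Swapping the active mapping over time is
# what lets the virtual "instrument" evolve.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Gesture:
    energy: float   # overall movement intensity, normalized 0..1
    attack: float   # sharpness of onset, normalized 0..1

# A mapping turns a gesture into named synthesis parameters.
Mapping = Callable[[Gesture], Dict[str, float]]

def explicit_mapping(g: Gesture) -> Dict[str, float]:
    # Direct, instrument-like: one gesture feature drives one parameter.
    return {"amplitude": g.energy, "grain_rate": 2.0 + 20.0 * g.attack}

def implicit_mapping(g: Gesture) -> Dict[str, float]:
    # Indirect: features are blended, so cause and effect are looser.
    blend = 0.5 * (g.energy + g.attack)
    return {"amplitude": blend ** 2, "grain_rate": 2.0 + 20.0 * (1.0 - blend)}

class Ecosystem:
    """Holds the active mapping; replacing it makes the instrument evolve."""
    def __init__(self, mapping: Mapping):
        self.mapping = mapping

    def play(self, g: Gesture) -> Dict[str, float]:
        return self.mapping(g)

eco = Ecosystem(explicit_mapping)
print(eco.play(Gesture(energy=0.8, attack=0.5)))  # direct response
eco.mapping = implicit_mapping                    # the ecosystem evolves
print(eco.play(Gesture(energy=0.8, attack=0.5)))  # same gesture, new behavior
```

In a real setup the dictionary of parameters would be sent on to a synthesis engine; here it simply shows how the same gesture can yield different sound behaviors as the ecosystem changes state.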
The first line of research concerns sensors. Basile Chassaing's starting point is a well-known and successful device developed by IRCAM's Sound Music Movement Interaction (ISMM) team: R-IoT sensors (combining an accelerometer, a gyroscope, and a magnetometer, they capture 3D movement on 9 axes).
"I'm keeping the R-IoTs because I'm trying to limit the number of tools in order to simplify the logistics and performance of the works. However, I'd like to add a system with buttons so that the dancer can trigger certain synthesis or transformation processes independently, and control them with refinement and autonomy, like a performer. Emmanuel Fléty, electronics engineer at IRCAM and head of the Engineering and Prototype Department, is currently working on a new version of these sensors."
Working with mapping
Work on the mapping is twofold. On the one hand, it involves a series of collaborative sessions with choreographer Emmanuelle Grach to establish a common gestural language that will then be used to control the electronic processes. This stage concerns both gesture descriptors (and the efficiency of their capture) and sound descriptors, the aim being to interpret the score with a certain virtuosity (choreographically and musically).
The other part of the research concerns the mapping itself: from retrieving the sensor data to producing sound. "Much of the work will be carried out using MuBu and especially CataRT, a concatenative synthesis tool developed by the ISMM team."
"The challenge here will be to transpose the performance space to the 2D space of the CataRT environment. My initial ideas focus on notions that cut across the two disciplines of dance and music: tempo, energy and attacks."
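Transposing the performance space into a 2D descriptor space can be illustrated by a simple projection: two gesture magnitudes normalized onto a unit square. This is a hedged sketch, not the actual CataRT/MuBu API; the feature choices, axis ranges, and function names are all assumptions:

```python
# Hypothetical sketch: reduce two sensor-derived gesture magnitudes to a
# 2D point that could steer navigation in a CataRT-style descriptor space.
# Axis assignments and ranges are illustrative assumptions.

def clamp01(x: float) -> float:
    """Clamp a value into the normalized range [0, 1]."""
    return max(0.0, min(1.0, x))

def gesture_to_2d(accel_mag: float, gyro_mag: float,
                  accel_max: float = 4.0, gyro_max: float = 500.0):
    """Map accelerometer/gyroscope magnitudes onto a unit square.

    x: movement energy (acceleration magnitude, assumed in g, full scale 4 g)
    y: rotational speed (gyroscope magnitude, assumed in deg/s, full scale 500)
    """
    x = clamp01(accel_mag / accel_max)
    y = clamp01(gyro_mag / gyro_max)
    return (x, y)

print(gesture_to_2d(2.0, 250.0))   # mid-energy, mid-rotation → (0.5, 0.5)
print(gesture_to_2d(10.0, 9000.0)) # saturated gesture → (1.0, 1.0)
```

The resulting point would then select or interpolate between sound grains whose descriptors fall near those coordinates; tempo, energy, and attack could each be candidates for an axis, as Chassaing suggests.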
"It's easy enough to create mappings," says Chassaing. "But we'll certainly find other avenues as we experiment!"
Artificial Intelligence gets involved
Artificial intelligence comes in a second phase. Here again, the experimental research hypotheses are twofold: AI will, on the one hand, reinforce the coherence and predictability of the device, and on the other, give it a form of creativity of its own, in interaction with the dancer.
In the first case, intensive machine learning sessions will be required to refine gesture recognition. The idea is that, as with any other musical instrument, the same gesture always produces the same sound.
The second case is more speculative. "As it stands," says Chassaing, "the computer reacts to the dancer's gestures, but doesn't propose any content outside this framework. In a performance situation, I'd like the computer to be able to move away from this direct gesture/sound association to generate another type of discourse, to which the dancer can react. The ideal tool for this would be Dyci2, the software dedicated to improvised interaction developed by Jérôme Nika in the Musical Representations team. Dyci2 could be the spanner in the works, the "S" in "SPA(S)M". I'm more interested in what AI does wrong than what it does right!"
Photo 2: R-IoT sensor © IRCAM-Centre Pompidou, photo: Philippe Barbosa
In future episodes, we'll be meeting Emmanuel Fléty, electronics engineer at IRCAM, and Emmanuelle Grach, choreographer and collaborator of Basile Chassaing.