
REACH #3: Mikhail Malt, or The Incarnation of the Ghost in the Machine

The Artistic Residency Blog

Mikhail Malt has been a pillar of IRCAM for over 30 years. While he is best known for his role as a computer music designer and professor, his skills go far beyond teaching. A year and a half from retirement, he has returned to his first loves: research, but also musical interpretation and composition. All of this within the framework of REACH, a project to which he adds an aspect previously neglected by computer music:

Embodiment.

Mikhail Malt

You have spent most of your career at IRCAM in the Education Department, but you are first and foremost a researcher and a musician.

My background is twofold. In Brazil, where I grew up, I attended the São Paulo Polytechnic School, where I studied chemical engineering. I then studied composition and orchestral conducting. The first time I came to IRCAM was in 1990, to follow a doctoral course in twentieth-century music and musicology set up by Hugues Dufourt with the Ecole Normale Supérieure - an unforgettable experience, for me as for many others. I went on to work at IRCAM, first as a doctoral student; then, at the end of 1991, Gérard Assayag (already!) offered me a contract as a research composer in the Musical Representations team, which I still work with today! My field was mathematical models and computer-aided composition (CAC). I then completed a post-graduate diploma on representation in the context of CAC and computational musicology (computer-assisted analysis).
I never lost touch with research, even when I was busy with my other functions. What's more, my activities in the Education Department required me to keep abreast of the new tools developed by the various teams, including those of the Musical Representations team. I was in constant dialogue with the researchers.
So Gérard Assayag's offer to join the team on a full-time basis as part of REACH appealed to me immediately - it was like coming full circle.

Why did he reach out to you?

I think he invited me because of my expertise in generative algorithms. Given my experience as a composer and computer music designer, my role is more concerned with the artistic applications of these tools. In truth, being new to the team, it was a long road for me to master these instruments. I knew a little about Somax, but as I delved deeper, a whole series of questions popped up. The tool itself, this sound-generating object, raised a number of questions, and not trivial ones. Whether it's "intelligent" or not, the fact is that it knows how to wander around a piece of music on its own. But what struck me even more was that, with Somax, time seemed to have been abolished: the whole piece is contained in a single instant.
Another aspect that really got me thinking was the way Somax reacts to external stimuli and, even more so, the way the operator interacts with this object, which itself interacts with those stimuli. The fairly static posture of setting parameters and waiting for reactions was unsatisfying to me.
The final and equally important issue is composition. Within the framework of idiomatic improvisation (i.e. within a given style, generally provided by the musical corpus from which it is fed, which is the primary aim of Somax), the generation of the various aspects of musical structure (melody, bass, rhythm...) occurs quite naturally. But as soon as you break out of the idiomatic framework, the question of composition (and therefore of the musical environment in which the agent evolves), or even interpretation, is added to that of improvisation. So how do we manage this complexity?
That's how I came to see the need to instrumentalize the tool: the ability to manage it effectively via gestures became a priority for me.

Was embodiment, in terms of the interpretation of a discourse derived from computer music—which is an essential issue today, both for the readability of the discourse by the audience and for its spectacular or simply expressive aspects—already a preoccupation before you joined the Musical Representations team, as part of your computer music design activities or within the Education department?

My musical life didn't begin at IRCAM. Before moving to France about 30 years ago, I had experience as a musician in Brazil, as a flautist and as a conductor of choirs and youth orchestras. The embodiment of the instrumental gesture, that direct control of sound, was very important to me at the time. Then, since most of my research and activities concerned computer-aided composition (CAC) and sound synthesis, as well as composition itself (which in turn fed my research), the question of embodiment remained somewhat in the background for me, in the realm of longing. CAC is basically a matter of timeless algorithms that produce material. I have, however, created living systems, for example for real-time installations animated by virtual agents - but without direct human action on sound production, or only via two or three controllers.

When the need arose, how did you approach the problem?

The first hurdle to overcome was to identify which of the Somax agent control parameters were relevant for real-time management in terms of discourse generation and interaction with the improviser - remembering that the Somax controller is only an intermediary.
Of the hundred or so parameters in total, I identified 60 to 70 - which is still a lot! It was therefore necessary to make an effort to synthesize them. I carried out a number of tests, grouping certain parameters so as to manage them all together: a single instrumental gesture can thus potentially modify a bunch of parameters in parallel… on different scales, of course.
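The grouping he describes can be pictured with a small sketch. This is purely illustrative - the parameter names and ranges below are invented, not Somax2's actual API - but it shows how a single normalized gesture value can drive several parameters in parallel, each on its own scale and curve:

```python
# Illustrative sketch only: parameter names and ranges are hypothetical,
# not Somax2's real control parameters.

def map_gesture(tilt: float) -> dict:
    """Map one normalized gesture value in [0, 1] to a group of parameters.

    Each parameter gets its own range and response curve, so a single
    instrumental gesture moves several controls in parallel, on
    different scales.
    """
    t = max(0.0, min(1.0, tilt))  # clamp to the expected range
    return {
        "continuity": 1 + t * 9,       # linear, 1..10
        "output_gain": t ** 2,         # quadratic, 0..1 (finer at low end)
        "temperature": 0.5 + t * 1.5,  # linear, 0.5..2.0
    }
```

A quadratic curve on the gain, for instance, gives finer control near silence, where the ear is most sensitive - one plausible way of putting several parameters "on different scales" under one gesture.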

All that remained was to determine which instrumental gestures, of what nature, and, above all, how to translate them into computer terms...

That's right, and I've tested several devices. In this story, I'm both performer and luthier! Being a flautist by training, I tested virtual flutes, such as the Ri.corder recorder, but they didn't give good results: none of them had natural flute fingerings to control them, and the latency was often prohibitive due to a Bluetooth connection.
Of course, I tested the R-ioT sensors, developed by the ISMM team, which are excellent and very complete. In this case, however, what I lacked was the time needed to master and calibrate the tool and then develop a suitable patch. After all, we were facing the deadline for the concert with Joëlle Léandre and Horse Lords on June 16, 2023.
Then I found a ring that already exists on the market. It picks up two gestures (vertical acceleration and rotation, "tilt" and "roll", as on an airplane), so I can control two agents. What's more, since it establishes a radio-frequency connection, its latency is minimal. It's fairly easy to handle, and I've learned a lot simply by playing with it; learning to play it came slowly, over the months.
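The two-axes-to-two-agents routing he describes can be sketched as follows. The message addresses are invented, OSC-style placeholders for illustration, not Somax2's real control messages:

```python
# Illustrative only: the addresses below are hypothetical, OSC-style
# placeholders, not Somax2's actual message names.

def route_axes(tilt: float, roll: float) -> list:
    """Turn one two-axis ring reading into two per-agent messages:
    the tilt axis drives one agent, the roll axis the other.
    """
    return [
        ("/agent1/control", tilt),  # "tilt": vertical inclination axis
        ("/agent2/control", roll),  # "roll": rotation axis
    ]
```

The point of the sketch is the independence of the two channels: because each axis maps to its own agent, one hand gesture can steer two musical voices at once.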

How do you play it, exactly?

With Somax, the ring leaves me a free hand, so I can work directly in the software patch to choose the agents I control and the corpora they use, fine-tune various parameters, possibly trigger other events (synthesis, etc.), or even give the machine more freedom. On the other hand, I have an immoderate taste for textures, and I needed to shape those generated by Somax more finely. So I have one hand "interpreting" and the other "composing", in a patch I developed myself from the modular version of Somax2.
What's more, the patch I use changes according to the musician I'm working with, to suit his or her style. I have one patch to interact with Joëlle Léandre, another for Benny Sluchin, and so on.
In this way, I've developed an instrument capable of controlling the tool quickly and efficiently, and capable of providing instant responses to the musician's proposals. I'm even beginning to be able to create what look like scores to help the agents' behavior evolve over time. However, I'd like to make it clear that, when I perform with the ring, I also improvise my gestures in response to what the musician does.

Somax works from a body of music that it has "learned" and scans: did you have to choose one that was suited to this game with the ring?

Absolutely: I've built up a specific group of them. I even composed and recorded a small percussion piece for this very purpose. And as I go along, I include new sounds linked to the relationships I've built up with the musicians I play with.
An interesting story: for the concert we performed with Joëlle Léandre on June 16 as part of ManiFeste, I changed the corpus. The one I used was made up of sounds she'd never heard before: she thought she was hearing flute, but it was percussion! But Joëlle loves to be surprised, and her adaptability and flexibility in flowing with the moment astounded me, as it always does. In this case, it was very interesting, because she didn't let anything show: from time to time, improvisation sends its participants into a trance-like state.

What were the initial reactions of the musicians you improvised with?

The first time I used the ring was during one of our many work sessions with Joëlle Léandre (Gérard, Marco, Manuel Poletti, and I saw each other every two or three weeks for six months; all those trials, discussions, and arguments created a close bond that would influence the improvisation). I didn't tell her anything - just that I wanted to try out something new.
Her first reaction was anger... and humor! Without thinking, because it was in my muscle memory, I immediately fell into a conductor's gesture. Joëlle stopped at once, bristling at these gestures, which she perceived as commanding. Since then, I've been trying to detach my movements from any imaginary conducting, or even percussion playing. But I don't want to have to hide this gesture, because it says something about the music being made. Nor should it become an obstacle to interaction with the musician.

Did Joëlle get used to it?

Yes, after that first time, I think she liked it, because it was the first time in our exchanges that the machine, or one of us, reacted so quickly to what she proposed. Somax can be very reactive, but it has to know what to react to, i.e. which aspect (melodic, rhythmic, harmonic) - and it has to be the most salient, most relevant one. Without the ring, the interaction can be much less clear. The ring allows us to give an instant response, and Joëlle felt this feedback: her gesture was perceived and answered. For my part, I have the feeling that she didn't react in the same way depending on whether I played with or without the ring.
However, during the concert, she was turned slightly to one side, to avoid being distracted by my gestures. She tried to look away and concentrate on listening. My gestures were intended to interact with her, but also to express a musical energy, and I think she appreciated the context all the same: I could see her occasionally looking at me out of the corner of her eye.

Has there been any feedback from the public about this ring?

Yes. It undoubtedly fills a gap, because the experience of music as a performance is often multimodal. We're not necessarily aware of it, but audiences attach great importance to what musicians give them to see, especially when they share the same space with them.
But it's not easy: really playing a controller is extremely difficult. It's a musical instrument for which you have to invent everything.