As part of his artistic research residency at IRCAM with the Analysis of Musical Practices team in the STMS lab, the New York-based German composer Simon Kanzler is exploring the notions of synchronization and desynchronization for the recomposition of space, via a multidimensional approach.
Time is relative – as we have known at least since Einstein – and so, no doubt, is the way we experience it. While this issue is well known in the fields of physics and philosophy, how does it relate to music, often described as an “art of time” in the sense that it exists only in time? This holds true for the sound wave, inseparable from its frequency (the number of oscillations of air pressure within a given time interval), but also for rhythm and musical form, which are different ways of structuring sound material over time. This raises the following question: what does the relativity of time presuppose, represent, and provoke with regard to musical expression? That is what Simon Kanzler has chosen to explore as part of his artistic research residency, drawing on an unusual background: originally trained in jazz, he is both a graduate of IRCAM’s Cursus program and a developer of digital instruments designed for improvisation, which he uses in his own work.
Simon Kanzler © Deborah Lopatin
“My musical background is different from that of most composers working at IRCAM,” explains Kanzler. “I enjoy groove, and I have always been fascinated by highly rhythmic music. When I compose, it’s something that’s always on my mind and that I naturally turn to when creating structures and trying to extend their language or form, even when I’m working with algorithmic processes. For a long time now, I’ve been interested in how polytempo structures (several simultaneous tempos) and polymetric structures (overlapping rhythmic groupings) can be used to organize musical forms in novel ways, to move beyond linear processes, and to help create a complex, spatialized, immersive polyphony for listeners – with different instrumental groups and electronic diffusion devices distributed throughout the space. I’m equally interested in pure composition and engineering, and I want to develop tools for the creation and notational representation of sounds within that musical genre, both for my own work and for other artists.”
Luckily, his reflections are shared by Clément Canonne, head of the Analysis of Musical Practices team, who, among other research topics, works on temporal desynchronization, particularly in the context of improvised music such as jazz. One of the first steps of the residency was therefore to build on the work carried out by the APM team, analyzing contemporary and other improvised musical works that explore the concepts of desynchronization (as in Steve Reich’s phase-shifting techniques) or spatial composition (as in Charles Ives’s experiments with fragmenting an ensemble into several distinct entities).
“This research was the starting point for my composition sketches. The bach library for Max already offers several tools that can help produce polymetric and polytemporal discourse, by generating out-of-phase measures and the corresponding score. From there, I began studio work, including binaural experiments, first focusing on simple loops before moving on to non-repetitive structures. The next step involved putting these sketches to the test: together with Clément, I explored different systems – either through audio simulations or with musicians – to analyze how these desynchronization phenomena are perceived.
“For instance, in a situation where different tempos are played simultaneously without a click track[1], is it more difficult for musicians to maintain their own tempo when these tempos are far apart, or conversely, when they are very close together? How does an ensemble adapt to a tempo shift when guided by a click track? Does this vary depending on the context, the number of performers, the type of music, or the range of tempo shifts? All this data is highly interesting, particularly when applied to musical notation. It is partly for the opportunity to conduct such experiments that I have already begun working with the Swiss percussion ensemble Eklekto.”
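The out-of-phase measures Kanzler mentions follow simple arithmetic: layers whose measures have different lengths share a downbeat again only after the least common multiple of those lengths. A minimal sketch in plain Python (an illustration only, not the bach/LISP tooling used in the residency; the function names are hypothetical):

```python
from math import lcm

def realignment_period(*measure_lengths):
    """Beats until out-of-phase measures all share a downbeat again."""
    return lcm(*measure_lengths)

def shared_downbeats(measure_lengths, total_beats):
    """Beat indices (0-based) where every layer's downbeat coincides."""
    return [b for b in range(total_beats)
            if all(b % m == 0 for m in measure_lengths)]

# A 3/4 layer against a 4/4 layer: downbeats realign every 12 beats.
period = realignment_period(3, 4)            # 12
coincidences = shared_downbeats([3, 4], 24)  # [0, 12]
```

With more layers the shared downbeat recedes quickly – `realignment_period(3, 4, 5)` gives 60 beats – which is one way such textures avoid obvious repetition.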
“During our working sessions, we experimented under various conditions: with and without a click track, with tempos that were either far apart or close together, changing continuously or by increments, and involving similar or contrasting timbres. Some of the sketches I had created using dynamic synchronization models also allowed us to explore different polytempo approaches. I believe the sketches that produced the most compelling results were those featuring variations and subtle shifts in pace and duration, generating a gradual transition from a desynchronized to a synchronized state. From a strictly formal perspective, this also provides an opportunity to explore the dramaturgical potential of the tension between desynchronization and synchronization, much like the tension between dissonance and consonance in classical harmony. What makes my situation unique is that I can stay very close to the sound itself.”
“My long-term objective for this residency is to develop computer-aided composition tools that give artists the ability to control the three dimensions of time, tone, and space, and to test the sonic outcome through score simulation in the bach environment in Max.”
“For some time now, I have been developing my own environment using the LISP[2] programming language, which I use to generate the processes driving my compositions. It provides a solid working base that still needs to be developed further to become a proper tool. In addition to this score-generation approach, my goal is to develop, in the future, synchronization models to control electronics in real time.”
“To focus on this subject, a new two-year research project has been initiated by Carmine Emanuele Cella. I have already implemented, in Max and in the programming language Antescofo, a new model called the ‘Circle Map Phase Oscillator Model.’[3] I work with sound masses composed of numerous independent agents, each with its own tempo, but all following the same master clock. This allows me to move from a completely desynchronized state – reminiscent of a swarm-like granular texture – to a perfectly synchronized state that produces pulse and rhythmic patterns with a clearly discernible tempo. This is the process I used in my sketches for percussion ensemble, but extended here to many more voices. My goal is to combine this system, which I am calling ‘pulse2texture,’ with my score-generation approach in Max/bach/LISP, in order to control electronics in Antescofo and generate scores in the bach environment. This could create new ways for artists and electronics to interact.”
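The circle-map model Kanzler describes is specified in Large’s paper cited below. As a rough illustration only – plain Python with NumPy, not his Max/Antescofo implementation, and with all parameter values assumed – a sine circle map pulling many detuned agents toward a master clock might look like:

```python
import numpy as np

def simulate(n_agents=64, steps=400, K=1.0, seed=0):
    """Sine circle map: each agent keeps its own tempo but is
    nudged toward a shared master clock by a coupling term K."""
    rng = np.random.default_rng(seed)
    master_freq = 0.25                                   # master clock, cycles per step
    freqs = master_freq + rng.normal(0, 0.02, n_agents)  # individually detuned tempos
    phases = rng.random(n_agents)                        # fully desynchronized start
    master = 0.0
    history = [phases.copy()]
    for _ in range(steps):
        master = (master + master_freq) % 1.0
        pull = (K / (2 * np.pi)) * np.sin(2 * np.pi * (master - phases))
        phases = (phases + freqs + pull) % 1.0
        history.append(phases.copy())
    return history

def dispersion(phases):
    """Circular variance: ~1 for a swarm-like texture, ~0 for a locked pulse."""
    return 1 - abs(np.mean(np.exp(2j * np.pi * phases)))

history = simulate()
before, after = dispersion(history[0]), dispersion(history[-1])
```

With K = 0 the detuned agents drift apart into the swarm-like granular texture described above; raising K locks them into a common pulse, tracing exactly the desynchronized-to-synchronized trajectory the sketches exploit.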
Jérémie Szpirglas
[1] A click track works like a digital version of a metronome: it helps performers maintain a constant tempo during studio recording sessions, and it also gives valuable cues to the musician who is recording, signaling changes of tempo or time signature in the track.
[2] Lisp is the oldest family of programming languages that supports both imperative and functional programming styles.
[3] As described in “Dynamical models for musical rhythm perception and coordination” by Edward W. Large.