Writing for Sound and Space
Technologies for sound analysis, synthesis, and signal processing have opened up significant, even entirely new, possibilities for composition. With this in mind, new functions have been developed in the OpenMusic environment to integrate these technologies into the compositional process. Several specialized libraries make it possible to connect programs created in a computer-assisted composition environment with sound processing or synthesis processes (built with IRCAM tools such as SuperVP, pm2, CHANT, Modalys, and Spat, but also with tools such as Csound and Faust). Bringing sound synthesis and computer-assisted composition together offers a new approach to the representation of sounds through high-level programs: abstract data structures and advanced temporal models give the composer full control over sound processing and over the generation of synthesis data.
Developed in collaboration with Marco Stroppa, the OMChroma library makes it possible to control sound synthesis processes using matrix data structures. OMPrisma, OMChroma's extension for spatialization, makes it possible to implement "spatialized sound synthesis" processes, taking spatialization into account (positions and trajectories, but also room characteristics, sound source orientation, and directivity) at the moment sounds are produced. Controlled in OpenMusic through a set of graphical editors and operators, these tools offer a wealth of possibilities for the joint specification of synthesized sounds and spatialized sound scenes.
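To make the matrix idea concrete, the following is a minimal sketch in Python, not the OMChroma API: it assumes a simplified model in which each row of the matrix is a synthesis parameter and each column describes one synthesis component (for instance one additive partial), with shorter rows cycled to fill the matrix. All names (`make_matrix`, `components`, the parameter names) are hypothetical.

```python
def make_matrix(**rows):
    """Build a parameter matrix from per-parameter value lists.
    Shorter rows are cycled to the length of the longest row,
    mimicking the way table-based control structures expand data."""
    n = max(len(v) for v in rows.values())
    return {name: [vals[i % len(vals)] for i in range(n)]
            for name, vals in rows.items()}

def components(matrix):
    """Yield one dict of parameter values per column (component)."""
    n = len(next(iter(matrix.values())))
    for i in range(n):
        yield {name: vals[i] for name, vals in matrix.items()}

# Hypothetical example: three additive components sharing one duration,
# with the amplitude list cycled over the three columns.
m = make_matrix(freq=[440.0, 550.0, 660.0],
                amp=[0.5, 0.3],
                dur=[2.0])
events = list(components(m))
```

Each yielded dictionary can then be handed to whatever synthesis back end renders one component, which is the general pattern the matrix control structures support.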
The OM-Chant project has brought FOF (Formant Wave Function) synthesis, inspired by the model of vocal production, back into the heart of computer-assisted composition, making it possible to create such synthesized sounds at the core of CAC processes.
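For readers unfamiliar with the technique, a single FOF grain is a damped sinusoid at a formant frequency with a smooth raised-cosine attack; a vocal-like formant is built by emitting such grains at the fundamental rate. The sketch below is a simplified illustration of one grain under that assumed form, not the CHANT or OM-Chant implementation; the parameter names and values are hypothetical.

```python
import math

def fof_grain(f0, alpha, beta, dur, sr=44100):
    """One Formant Wave Function (FOF) grain: a sinusoid at formant
    frequency f0, damped by exp(-alpha*t), with a raised-cosine
    attack of duration pi/beta. Simplified, for illustration only."""
    out = []
    for n in range(int(dur * sr)):
        t = n / sr
        if t < math.pi / beta:
            env = 0.5 * (1.0 - math.cos(beta * t))  # smooth attack
        else:
            env = 1.0                                # decay phase
        out.append(env * math.exp(-alpha * t)
                   * math.sin(2 * math.pi * f0 * t))
    return out

# Hypothetical grain: 600 Hz formant, fast decay, 50 ms long.
grain = fof_grain(f0=600.0, alpha=200.0, beta=2000.0, dur=0.05)
```

The bandwidth of the resulting formant is governed mainly by the decay rate `alpha`, and its skirt by the attack rate `beta`, which is why these are the natural control parameters in a compositional context.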
The theme of space, explored in collaboration with the Acoustic and Cognitive Spaces team, has been reinforced by a project on the design of interfaces for controlling sound spatialization. Following a participatory design approach, interviews were conducted with composers to better understand their needs, with the goal of providing tools that facilitate the control and writing of spatialization. These interviews revealed the need for interactive tools to enter, visualize, and manipulate spatialization control data, and led to the design of several prototypes.
The OpenMusic objects and tools for 2D/3D spatial design are based on a model of time-stamped data, enabling homogeneous management of temporal specifications and transformations. Connected to a sequencing engine and to high-level temporal structures, these data make it possible to represent sound scenes and their evolution over time. An interactive object was developed using the visualization, interaction, and spatialized rendering models of the Spat library.
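One way to picture such time-stamped spatial data is a trajectory stored as a sorted list of (time, position) pairs that can be queried at any instant. The sketch below is a minimal, hypothetical illustration of that idea in Python with linear interpolation; it does not reflect the actual OpenMusic object model, and `position_at` and the sample trajectory are invented for the example.

```python
from bisect import bisect_right

def position_at(traj, t):
    """traj: sorted list of (time, (x, y, z)) pairs. Returns the
    linearly interpolated position at time t, clamped at both ends."""
    times = [p[0] for p in traj]
    if t <= times[0]:
        return traj[0][1]
    if t >= times[-1]:
        return traj[-1][1]
    i = bisect_right(times, t)
    (t0, p0), (t1, p1) = traj[i - 1], traj[i]
    a = (t - t0) / (t1 - t0)
    return tuple(p0[k] + a * (p1[k] - p0[k]) for k in range(3))

# Hypothetical source path sampled at three instants (seconds):
traj = [(0.0, (1.0, 0.0, 0.0)),
        (1.0, (0.0, 1.0, 0.0)),
        (2.0, (-1.0, 0.0, 0.0))]
pos = position_at(traj, 0.5)  # midway between the first two points
```

Because the trajectory is just timed data, the same structure can be stretched, shifted, or resampled by transforming the time stamps, which is what makes homogeneous temporal management possible.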
IRCAM team: Music Representations.