Computer Assisted Composition: Writing Sound, Time, and Space
The purpose of research in computer-assisted composition (CAC) is to study and design models and computer techniques adapted to the creative process, incorporating computational paradigms as well as musical interactions and representations. This approach favors a symbolic orientation, using programming languages for artistic creation and for processing harmonic, temporal, and rhythmic data, in addition to other aspects that come into play in the compositional process. Our work in this domain is articulated primarily around the OpenMusic environment, a visual programming language based on Common Lisp and dedicated to musical composition. Contemporary music composers have used this environment for the past 15 years. Today, it is regarded as one of the principal references in computer-assisted composition and has been downloaded by several thousand users around the globe.
OpenMusic (OM) is a visual programming environment for computer-assisted composition and musical analysis. OM offers users a range of interconnected modules associated with specific functions, making up patches that enable the creation or transformation of musical data structures. OM also offers several editors for manipulating these data, in addition to specialized libraries for sound analysis and synthesis, mathematical models, the resolution of constraint problems, etc. Unique interfaces like the maquette editor let users construct structures that include functional and temporal relationships among musical objects. OpenMusic is used by a large number of composers and musicologists; it is taught in all major computer-music centers and in several universities worldwide.
Recently, a new computation and programming paradigm was proposed for the OpenMusic environment, combining the existing demand-driven approach with a reactive approach inspired by event-driven, interactive real-time systems. The activation of reactive channels in visual programs increases the possibilities for interaction in a CAC environment: an event (a change or an action made by a user) in a program, or in the data it is built from, produces a series of reactions leading to an update (reevaluation). An event can also come from an outside source (typically, a MIDI port or an open UDP port attached to an element of the visual program). Two-way communication can thus be established between visual programs and external applications or systems. The CAC environment finds itself inserted into the temporality of a larger system, potentially governed by events and interactions produced by or in this system. This temporality could be that of the compositional process, or that of the performance. This project has produced the open-source software OM#, which is now developed independently of IRCAM.
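The combination of demand-driven evaluation and reactive propagation described above can be sketched in a few lines of code. The sketch below is purely illustrative (all class and method names are hypothetical, and this is not the OpenMusic API): boxes cache their results and compute only when a value is requested, while an incoming event on a "reactive" box triggers reevaluation downstream.

```python
# Illustrative sketch of demand-driven evaluation combined with
# reactive updates, in the spirit of OM's reactive extension.
# All names are hypothetical; this is not the OpenMusic API.

class Box:
    """A node in a dataflow patch: a function plus its input boxes."""
    def __init__(self, fn, inputs=(), reactive=False):
        self.fn = fn
        self.inputs = list(inputs)
        self.reactive = reactive      # does this box react to upstream changes?
        self.listeners = []
        self.cache = None
        self.valid = False
        for src in self.inputs:
            src.listeners.append(self)

    def value(self):
        """Demand-driven evaluation with caching."""
        if not self.valid:
            self.cache = self.fn(*[src.value() for src in self.inputs])
            self.valid = True
        return self.cache

    def notify(self):
        """Reactive propagation: invalidate, and reevaluate if reactive."""
        self.valid = False
        if self.reactive:
            self.value()
        for box in self.listeners:
            box.notify()

class Input(Box):
    """A source box whose value can be set by a user action or an
    incoming event (e.g. a MIDI or UDP message)."""
    def __init__(self, v):
        super().__init__(fn=lambda: v)
    def set(self, v):
        self.fn = lambda: v
        self.notify()

# A tiny patch: transpose a chord (MIDI note numbers) by a whole tone.
chord = Input([60, 64, 67])
transposed = Box(lambda ns: [n + 2 for n in ns], [chord], reactive=True)
print(transposed.value())   # demand-driven: [62, 66, 69]
chord.set([55, 59, 62])     # event: downstream reactive boxes update
print(transposed.cache)     # [57, 61, 64]
```

The same graph serves both modes: without the `reactive` flag the box only recomputes when its value is next demanded, which mirrors how a patch can participate either in an offline compositional process or in a running performance.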
The technologies of sound signal analysis, processing, and synthesis allow us to envisage new writing modalities that bring sound creation into the heart of musical composition. OpenMusic allows the integration of such technologies via a set of specialized libraries linking programs created in the CAC environment to sound processing, synthesis, or spatialization engines (realized notably with IRCAM tools such as SuperVP, Pm2, Chant, Modalys, and Spat~, but also with external tools such as Csound or Faust). This convergence of the fields of sound and CAC constitutes a new approach to sound representation and processing through programs and high-level symbolic data structures.
Developed in collaboration with the composer Marco Stroppa, the OMChroma library provides the ability to control sound synthesis processes using matrix data structures.
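To give a concrete idea of matrix-based control, the following hedged sketch treats each row of a control matrix as one synthesis component and each column as a parameter, with scalar values broadcast and shorter lists cycled across components. The function names and layout are illustrative only, not the OMChroma API.

```python
# Hedged sketch of matrix-based synthesis control in the style of
# OMChroma: rows are synthesis components, columns are control
# parameters. Names and conventions here are illustrative.

def make_matrix(n_components, **params):
    """Build a control matrix: each parameter becomes a list of length
    n_components; scalars are repeated and shorter lists are cycled."""
    matrix = {}
    for name, value in params.items():
        if not isinstance(value, (list, tuple)):
            value = [value]
        matrix[name] = [value[i % len(value)] for i in range(n_components)]
    return matrix

def components(matrix):
    """Expand the matrix into one parameter dict per component, ready
    to be handed to a synthesis engine (e.g. as Csound score events)."""
    names = list(matrix)
    n = len(matrix[names[0]])
    return [{name: matrix[name][i] for name in names} for i in range(n)]

# Five components sharing one duration, with per-component frequencies
# and a two-value amplitude pattern that is cycled across the rows.
m = make_matrix(5, freq=[220, 330, 440, 550, 660], amp=[0.5, 0.25], dur=2.0)
for c in components(m):
    print(c)   # first component: {'freq': 220, 'amp': 0.5, 'dur': 2.0}
```

The appeal of this representation is that a whole bank of synthesis events is specified by a handful of high-level rules (constants, lists, patterns) rather than component by component; in OMChroma such matrices are built and transformed graphically inside OpenMusic patches.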
Its extension to the field of spatialization, OMPrisma, allows the realization of spatialized sound synthesis processes, involving spatialization parameters (positions and trajectories, but also room characteristics and the orientation or directivity of sound sources) at the same time as sound production. Controlled in OpenMusic through a set of graphical editors and operators, these tools offer a wide range of possibilities for the joint specification of synthesized sounds and spatialized scenes. The OM-Chant project has recently brought the technology of synthesis by FOFs (formant wave functions) back to the forefront, making it possible to create synthesized sounds, inspired by a speech-production model, at the core of CAC processes.
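A FOF grain, in Rodet's CHANT model, is a sine wave at a formant's center frequency shaped by a smooth attack and an exponential decay; summing a stream of such grains at the fundamental period produces a vowel-like tone. The sketch below computes one grain. The mapping of bandwidth to decay rate and attack time to the cosine rise follows the usual FOF description, but the exact constants are illustrative, and this is not OM-Chant code.

```python
import math

# Minimal sketch of one FOF (formant wave function) grain, after the
# CHANT model: a sine at the formant center frequency, shaped by a
# half-cosine attack and an exponential decay. Constants illustrative.

def fof_grain(freq, bandwidth, attack, duration, sr=44100):
    """Return one FOF grain as a list of samples.
    freq: formant center frequency (Hz); bandwidth: formant bandwidth
    (Hz), which sets the exponential decay; attack: rise time (s)."""
    alpha = math.pi * bandwidth          # decay rate from bandwidth
    beta = math.pi / attack              # half-cosine rise over the attack
    samples = []
    for n in range(int(duration * sr)):
        t = n / sr
        if t < attack:
            env = 0.5 * (1.0 - math.cos(beta * t)) * math.exp(-alpha * t)
        else:
            env = math.exp(-alpha * t)
        samples.append(env * math.sin(2.0 * math.pi * freq * t))
    return samples

# One grain of a 600 Hz formant: 80 Hz bandwidth, 3 ms attack, 30 ms long.
grain = fof_grain(600.0, 80.0, 0.003, 0.03)
```

Because each grain is described by a few perceptually meaningful parameters (center frequency, bandwidth, attack), banks of FOF generators can be driven directly from high-level symbolic data, which is what makes the technique a natural fit for CAC processes.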
IRCAM's Team: Musical Representations