Reflection on the high-level representation of musical concepts and structures, supported by original computer languages developed by the team, leads to models that can be used for both musical analysis and creation.
On the musicology side, tools for representation and modeling enable a truly experimental approach that significantly rejuvenates this discipline.
On the creation side, the objective is to design musical companions that interact with composers, musicians, sound engineers, and others throughout the musical workflow. The software developed by the team has been distributed to a large community of musicians, materializing original forms of thought connected to the particular characteristics of the computer media that represent (and execute) them: the final score, the score's different levels of formal elaboration, its algorithmic generators, and live interaction during a performance.
For the past few years, the team has worked on symbolic interaction and artificial creativity through projects on artificial listening, synchronization of musical signals, and score following (a subject that led to the creation of an INRIA project-team); orchestration assistance (using time-series analysis and deep learning techniques); and the engineering of intelligent agents capable of listening, learning, and musical interaction in improvised contexts.
The team has a long history of collaborations with composers and musicians both from IRCAM and elsewhere. Archives of this work can be found in three volumes of the OM Composer’s Book, guaranteeing its international dissemination and continuity.
- Computer-assisted composition and orchestration assistance
- Control of synthesis and spatialization; creative systems for writing time, sound, space, and interaction
- Mathematics and music
- Computer languages for music: OpenMusic, Antescofo
- Modeling style and the dynamics of improvised interaction: improvised musical interactions, DYCI2
- New interfaces for composers and teaching
- Musicology and computational analysis
- Efficient search of time series
- Writing synchronous time
Research topics and related projects
Design of models and techniques adapted to the creative process, incorporating computational paradigms as well as musical interactions and representations
Orchestration through the automatic search for instrumentations and superpositions of instruments approximating a target defined by the composer
Code enabling efficient searches over temporal forms
Development of improvised human-machine musical interactions
Algebraic Models, Topologies, and Categories in Computational Musicology
Orchestration-assistance tool integrated into the software created at IRCAM
New approach to the representation of sounds via high-level programs
Definition of complex interactions between performers and live electronics
European and national projects
Computational Shaping and Modeling of Musical Structures
Creative Dynamics for Improvised Interaction
Extended Frameworks For 'In-Time' Computer-Aided Composition
Hierarchical Object based Unsupervised Learning
Interactivity in Writing of Interaction and Time
Creation of the first partnership dedicated to a true scientific theory of orchestration
Mixed Musical Reality with Creative Instruments
Comparing the state of the art in musicology, the psychology of perception, and computer music to create new tools
Software (design & development)
Head Researcher: Gérard Assayag
Researchers & Engineers: Jérôme Nika, Corentin Guichaoua, Lawrence Fyfe, Mikhail Malt, Philippe Esling, Carlos Agon Amado, Jean-Louis Giavitto, Karim Haddad, Elaine Chew
Doctoral Students: Antoine Caillon, Daniel Bedoya, Martin Fouilleul, Théis Bazin, Mathieu Prang, José Miguel Fernandez, Jean-François Ducher, Adrien Bitton, Alessandro Ratoci, Tristan Carsault, Constance Douwes
Trainee: Gonzalo
Computer-assisted composition and analysis, computer musicology, cognitive musicology, artificial intelligence, computer languages, algebraic and geometric methods, symbolic interactions, languages for synchronous time and tempered time, executable notations
Bergen Center for Electronic Arts (Norway), CIRMMT/McGill University (Canada), City University London, CNSMDP, Columbia University New York, CNMAT/UC Berkeley, Electronic Music Foundation, Gmem, Grame Lyon, École normale supérieure Paris, EsMuC Barcelona, Harvard University, Inria, IReMus – Sorbonne Paris-4, University of Jyväskylä, University of Bologna, USC Los Angeles, Université Marc Bloch Strasbourg, Pontificia Universidad Javeriana Cali, Université Paris-Sud Orsay, University of Pisa, UPMC Paris, UC San Diego, Yale, U. Minnesota, U. Washington.