Musical Representations

The Musical Representations team works on the formal structures of music and creative environments for composition and musical interaction.

This work finds application in computer-assisted composition (CAC), performance, improvisation, and computational musicology. Reflection on the high-level representation of musical concepts and structures, supported by original computer languages developed by the team, leads to models that can be used for musical analysis and for creation in composition, performance, and improvisation.

On the musicology side, tools for representation and modeling enable a truly experimental approach that significantly rejuvenates this discipline.

On the creation side, the objective is to design musical companions that interact with composers, musicians, sound engineers, and others throughout the musical workflow. The software developed by the team has been distributed to a large community of musicians; it materializes original forms of musical thought tied to the particular characteristics of the computer media that represent (and execute) them: the final score, its different levels of formal elaboration, its algorithmic generators, and its sonic productions, making live interaction possible during a performance, even when improvising. The team brings symbolic interaction and artificial creativity into play throughout its work on the modeling of improvisation and on new open and dynamic compositional forms. This research drives advances in artificial intelligence, with models of listening, generative learning, and synchronization, and it lays the foundation for new technologies of creative agents that can become musical companions endowed with artificial musicality ("machine musicianship"). This prefigures the cooperative dynamics inherent in cyber-human networks and the emergence of forms of human-machine co-creativity.

The team has a long history of collaborations with composers and musicians, both from IRCAM and elsewhere. Archives of this work can be found in the three volumes of the OM Composer's Book, ensuring its international dissemination and continuity.

Major Themes

  • Computer-assisted composition
  • Computer-assisted orchestration
  • Control of synthesis and spatialization; writing time and sound; computer-assisted composition
  • Mathematics and music
  • Computer languages for creation: OpenMusic, Antescofo
  • The dynamics of improvised interaction, co-creativity
  • Creative artificial intelligence
  • Studies on musical structures in performance
  • Studies in connection with therapy

Team Website

  • First collage of the OpenMusic software identity, by A. Mohsen © Philippe Barbosa
  • Mederic Collignon improvising in concert with the OMax software (RIM B. Lévy)
  • Moreno Andreatta, CNRS-Ircam researcher, in his office © Philippe Barbosa
  • OM Composer's Book
  • Claude Delangle in the studio with the Antescofo software © Inria / H. Raguet
  • Papier Intelligent, writing space in composition © Inria

Specialist Areas

Computer-assisted composition and analysis, computer musicology, cognitive musicology, artificial intelligence, computer languages, algebraic and geometric methods, symbolic interactions, languages for synchronous time and tempered time, executable notations


Research topics and related projects

Computer-Assisted Orchestration

Orchestration via an automatic search for instrumentations and instrument layerings that approach a target sound defined by the composer

Computer-Assisted Composition: Writing Sound, Time, and Space

Design of models and techniques adapted to the creative process, incorporating computational paradigms as well as musical interactions and representations

Mathematics and Music

Algebraic Models, Topologies, and Categories in Computational Musicology

Somax2

Somax2 is an application for musical improvisation and composition. Implemented in Max, it is based on a generative model that uses a process similar to concatenative synthesis to provide stylistically coherent improvisation while listening and adapting to a musician (or any other audio or MIDI source) in real time. The model operates in the symbolic domain and is trained on a musical corpus, consisting of one or more MIDI files, from which it draws the material used for improvisation. It can be used with little configuration to interact autonomously with a musician, but it also allows manual control of its generative process, effectively letting the model serve as an instrument that can be played in its own right.

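To make the corpus-navigation idea concrete, here is a deliberately simplified sketch in plain Python (this is not the Somax2 API; names such as ToyCorpusImproviser and react are hypothetical): corpus slices extracted from MIDI are labeled by a symbolic descriptor, here a pitch class, and each incoming event from the listened-to source selects a matching slice while preferring continuity with the slice played last.

```python
from collections import defaultdict
import random

class ToyCorpusImproviser:
    """Toy illustration of corpus-based reactive generation (not Somax2 itself).

    The corpus is a list of (label, event) slices; `label` stands for any
    symbolic descriptor (here a pitch class), `event` for the material to play.
    """

    def __init__(self, corpus):
        self.corpus = corpus                      # [(label, event), ...]
        self.by_label = defaultdict(list)         # label -> candidate indices
        for i, (label, _) in enumerate(corpus):
            self.by_label[label].append(i)
        self.position = None                      # index of the last slice played

    def react(self, influence_label):
        """Pick a corpus slice matching the incoming label.

        Prefer a candidate that directly continues the previous slice,
        which keeps the output close to the sequences found in the corpus.
        """
        candidates = self.by_label.get(influence_label)
        if not candidates:                        # no match: fall back to anything
            candidates = range(len(self.corpus))
        contiguous = [i for i in candidates
                      if self.position is not None and i == self.position + 1]
        self.position = contiguous[0] if contiguous else random.choice(list(candidates))
        return self.corpus[self.position][1]

# Usage: a corpus of (pitch_class, midi_note) pairs, e.g. parsed from a MIDI file.
corpus = [(0, 60), (2, 62), (4, 64), (0, 72)]
improviser = ToyCorpusImproviser(corpus)
print(improviser.react(0))   # respond to an incoming C (pitch class 0)
print(improviser.react(2))   # respond to an incoming D (pitch class 2)
```
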

European and national projects

ACIDS

Artificial Creative Intelligence and Data Science

ACTOR

Analysis, Creation, and Teaching of Orchestration

COSMOS

Computational Shaping and Modeling of Musical Structures

DAFNE+

Decentralized platform for fair creative content distribution empowering creators and communities through new digital distribution models based on digital tokens

Heart.FM

Maximizing the Therapeutic Potential of Music through Tailored Therapy with Physiological Feedback in Cardiovascular Disease

MERCI

Mixed Musical Reality with Creative Instruments

REACH

Raising Co-Creativity in Cyber-Human Musicianship



Software (design & development)

OpenMusic

OpenMusic is a free visual programming environment for creating computer-aided composition applications. It offers users the possibility of graphically constructing procedures for processing or generating musical data, with the assistance of numerous graphical modules and predefined functions assembled into visual programs.

Antescofo

Antescofo is a free, modular score-following system as well as a synchronous programming language for musical composition. The module automatically recognizes the player's position and tempo in a musical score from a real-time audio stream coming from a performer, thus making it possible to synchronize an instrumental performance with computer…

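As a rough, hypothetical illustration of the score-following idea only (this is neither Antescofo's algorithm nor its score language; names like ToyScoreFollower are invented for the sketch), the Python code below tracks a performer's position in a list of expected pitches, re-estimates the tempo from onset times, and returns the electronic action attached to each score event so that a caller can synchronize the electronics.

```python
import time

# A hypothetical score: expected MIDI pitch, nominal duration in beats,
# and an optional electronic action to trigger at that event.
SCORE = [
    (60, 1.0, "start reverb"),
    (62, 0.5, None),
    (64, 0.5, "trigger sample A"),
    (67, 2.0, "fade out"),
]

class ToyScoreFollower:
    def __init__(self, score, initial_bpm=60.0):
        self.score = score
        self.index = 0                     # next expected event
        self.bpm = initial_bpm
        self.last_onset = None

    def on_note(self, pitch, onset=None):
        """Advance through the score when the expected pitch is detected."""
        onset = time.monotonic() if onset is None else onset
        if self.index >= len(self.score):
            return None
        expected_pitch, beats, action = self.score[self.index]
        if pitch != expected_pitch:
            return None                    # ignore wrong or extra notes (very naive)
        if self.last_onset is not None and self.index > 0:
            # Re-estimate tempo from how long the previous event actually took.
            prev_beats = self.score[self.index - 1][1]
            elapsed = max(onset - self.last_onset, 1e-6)
            self.bpm = 60.0 * prev_beats / elapsed
        self.last_onset = onset
        self.index += 1
        return action                      # caller synchronizes electronics with this

# Usage: feed detected pitches with their onset times (in seconds).
follower = ToyScoreFollower(SCORE)
for pitch, t in [(60, 0.0), (62, 1.0), (64, 1.5), (67, 2.0)]:
    action = follower.on_note(pitch, onset=t)
    if action:
        print(f"t={t:.1f}s bpm~{follower.bpm:.0f}: {action}")
```
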
OMax

OMax is a free environment for improvising with a computer. It analyzes, models, and re-improvises in real time the performance of one or several musicians, in audio or MIDI. OMax is based on a computer model called the Factor Oracle, a graph that links all the patterns in the material, from the shortest to the longest.

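The Factor Oracle can be built online in linear time. Below is a minimal Python sketch of the standard incremental construction on a symbolic sequence such as MIDI pitches; OMax's real-time audio/MIDI segmentation and its navigation of the graph (following forward transitions and jumping along suffix links to recombine patterns) are of course much richer.

```python
def build_factor_oracle(sequence):
    """Incremental (online) Factor Oracle construction.

    Returns:
        transitions: list of dicts, transitions[state][symbol] -> target state
        suffix:      list of suffix links (suffix[0] == -1)
    One state is created per input symbol, plus the initial state 0.
    """
    n = len(sequence)
    transitions = [dict() for _ in range(n + 1)]
    suffix = [-1] * (n + 1)

    for i, symbol in enumerate(sequence, start=1):
        transitions[i - 1][symbol] = i          # the "spine" transition
        k = suffix[i - 1]
        # Walk back along suffix links, adding forward transitions for this symbol.
        while k > -1 and symbol not in transitions[k]:
            transitions[k][symbol] = i
            k = suffix[k]
        suffix[i] = 0 if k == -1 else transitions[k][symbol]
    return transitions, suffix

# Usage on a toy pitch sequence: states connected by suffix links share a
# common context, which is what allows stylistically consistent recombination
# of the original material during generation.
transitions, suffix = build_factor_oracle([60, 62, 60, 62, 64])
print(transitions)
print(suffix)
```
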
Orchids

Orchids is the first complete system for temporal computer-assisted orchestration and timbral mixture optimization. It provides a set of algorithms and features to reconstruct any time-evolving target sound with a combination of acoustic instruments, given a set of psychoacoustic criteria.

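As a heavily simplified, hypothetical sketch of target-driven orchestration (Orchids itself optimizes over time-varying psychoacoustic descriptors with far more sophisticated search; the database values, feature set, and greedy strategy below are invented for illustration), the Python code assembles a small set of instrument samples whose averaged feature vector approximates a target's features.

```python
import math

# Hypothetical instrument database: name -> feature vector
# (e.g., [spectral centroid in Hz, loudness in sones]); the values are made up.
DATABASE = {
    "flute_A4":       [1800.0, 12.0],
    "oboe_A4":        [2400.0, 14.0],
    "violin_A4_sulA": [2100.0, 10.0],
    "horn_A3":        [900.0, 16.0],
    "cello_A3":       [700.0, 11.0],
}

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mix_features(names):
    """Very crude mixture model: average the features of the chosen samples."""
    vectors = [DATABASE[n] for n in names]
    return [sum(col) / len(col) for col in zip(*vectors)]

def greedy_orchestrate(target, max_instruments=3):
    """Greedily add the sample that most reduces the distance to the target."""
    chosen = []
    while len(chosen) < max_instruments:
        best_name = None
        best_dist = distance(mix_features(chosen), target) if chosen else float("inf")
        for name in DATABASE:
            d = distance(mix_features(chosen + [name]), target)
            if d < best_dist:
                best_name, best_dist = name, d
        if best_name is None:          # no addition improves the match
            break
        chosen.append(best_name)
    return chosen

# Usage: find up to three samples approximating a bright, moderately loud target.
print(greedy_orchestrate(target=[1500.0, 13.0]))
```
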
Musique Lab 2.0

An environment designed to assist music education, based on OpenMusic, IRCAM's computer-assisted composition software.

Team


Head Researcher: Gerard Assayag
Researchers & Engineers: Mikhail Malt, Jerome Nika, Carlos Agon Amado, Jean-Louis Giavitto, Karim Haddad, Joakim Borg, Marco Fiorini, Emily Graber, Emma Frid
Doctoral Students: Yohann Rabearivelo, David Genova, Nils Demerlé, Giovanni Bindi, Gonzalo Romero, Paul Lascabettes
Trainee: Lola-Marie Ferly
Doctoral Student: Ninon Devis
Administrative: Vasiliki Zachari
: Sasha J. Blondeau, Claudy Malherbe
Associated Researcher: Georges Bloch



Collaborations

Bergen Center for Electronic Arts (Norway), CIRMMT/McGill University (Canada), City University London, CNSMDP, Columbia University New York, CNMAT/UC Berkeley, Electronic Music Foundation, GMEM, Grame Lyon, École normale supérieure Paris, ESMUC Barcelona, Harvard University, Inria, IReMus – Sorbonne Paris-4, Jyväskylä University, University of Bologna, USC Los Angeles, Marc Bloch University Strasbourg, Pontificia Universidad Javeriana Cali, Université Paris-Sud Orsay, University of Pisa, UPMC Paris, UCSD San Diego, Yale University, University of Minnesota, University of Washington.


Publications
