• Research

    The fundamental principle of IRCAM is to encourage productive interaction among scientific research, technological development, and contemporary music production. Since its establishment in 1977, this principle has provided the foundation for the institute’s activities. One of the major challenges is contributing to the renewal of musical expression through science and technology. Conversely, sp…

    • Research Topics
    • The STMS Lab
    • Research Teams
    • Sound Systems and Signals: Audio/Acoustics, InstruMents
    • Acoustic and Cognitive Spaces
    • Sound Perception and Design
    • Sound Analysis-Synthesis
    • Sound Music Movement Interaction
    • Musical Representations
    • Analysis of Musical Practices
    • Projects
    • Sound Workshop
    • The Musical Body
    • Creative Dynamics
    • Finished Projects
    • Musique/Sciences Collection
  • Creation

    IRCAM is an internationally recognized research center dedicated to creating new technologies for music. The institute offers a unique experimental environment where composers strive to enlarge their musical experience through the concepts expressed in new technologies.

    • Composers & Artists in Studio
    • "I am in blood" for Sixteen Musicians and Live Electronics
    • Lucie Antunes in studio
    • Deena Abdelwahed in the studio
    • "Transfer" for 10 Musicians and Electronics
    • L'Annonce faite à Marie
    • In Ex Machina
    • Jazz Ex Machina
    • Improvise cum machina 1/2
    • Improvise cum machina 2/2
    • Like Sound, Like Flesh
    • Silent Talks
    • Music-Fictions
    • Artistic Research Residency
    • Artistic Residencies: The Blog
    • Rendez Vous 20.21
    • Season 2022.23
    • Seasons from 1996 to present
    • ManiFeste-2022 Website
    • ManiFeste festival from 2012 to 2022
    • L’Étincelle, IRCAM’s journal of creation
  • Transmission

    In support of IRCAM's research and creation missions, the educational program seeks to shed light on the current and future meaning of the interactions among the arts, sciences, and technology, and to share its models of knowledge, know-how, and innovations with the widest possible audience.

    • 2022.23 Training Courses
    • Max, Max for Live
    • OpenMusic
    • Modalys
    • TS2 and Partiels
    • Sound spatialization
    • From PureData to audio plugins
    • Sensors, Interfaces, and Interactive Machine Learning
    • Other training programs
    • Practical Information
    • Advanced Programs
    • Cursus Program on Composition and Computer Music
    • Supersonic Chair
    • Master ATIAM
    • Sound Design Master's Program
    • Music Doctorate
    • AIMove Master
    • School Programs
    • Studios of Creation
    • Mixed-Music
    • Artistic and Cultural Education
    • Career Discovery Visit
    • Images of a Work Collection
    • ManiFeste-2022, the Academy
  • Innovations

    At the center of societal and economic concerns combining culture and information technologies, IRCAM's current research is regarded by the international research community as a reference for interdisciplinary projects in the sciences and technologies of sound and music, constantly confronted with society’s new needs and uses.

    • The IRCAM Forum
    • Subscribe to the Forum
    • Softwares
    • Ircam Amplify
    • Industrial Applications
    • Industrial Licenses
    • Forum Vertigo






Sound Music Movement Interaction

The Sound Music Movement Interaction team (previously known as the Real-Time Musical Interactions team) carries out research and development on interactive systems dedicated to music and performances.

Our work relates to all aspects of the interactive process, including the capture and analysis of gestures and sounds, tools for authoring interaction and synchronization, and techniques for real-time sound synthesis and processing. These research projects and their associated software (MuBu for Max, CataRT, Soundworks) are generally carried out within interdisciplinary projects that bring together scientists, artists, teachers, and designers, and find applications in creative projects, music education, movement learning, and medical domains such as physical rehabilitation guided by sound and music.

Major Themes

  • Modeling and Analysis of Sounds and Gestures : this theme covers the theoretical developments concerning the analysis of the sound and gesture data, or more generally, multi-modal temporal morphologies. This research concerns diverse techniques for audio analysis, the study of the musician's gestures or dancers.
  • Interactive Sound Synthesis and Processing: : this focuses essentially on synthesis and sound processing methods based on recorded sounds or large collections of sound (corpus-based concatenative synthesis)
  • Interactive sound systems based on gesture and new instruments : this theme focuses on the design and development of interactive sound environments using gestures, movements, and touch. Interactive machine learning is one of the tools developed in this framework
  • Collective musical interaction and distributed systems : this theme addresses questions of musical interactions from a few users to hundreds. It concerns the development of a Web environment combining computers, smartphones, and/or embedded systems making it possible to explore new possibilities for expressive and synchronized interactions.
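
As a rough illustration of the interactive machine learning mentioned in the third theme above, the sketch below trains a toy gesture classifier from a handful of user demonstrations and then recognizes a new movement. It is a minimal, hypothetical Python example: the feature choices, class names, and data are invented for illustration and do not correspond to any IRCAM tool.

    import numpy as np

    class FewShotGestureClassifier:
        """Toy nearest-centroid classifier: each gesture class is learned
        from a few demonstrations recorded interactively by the user."""

        def __init__(self):
            self.examples = {}  # label -> list of feature vectors

        @staticmethod
        def features(frames):
            # Summarize a recording (N x 3 accelerometer frames) into a
            # fixed-size descriptor: per-axis mean, standard deviation, energy.
            frames = np.asarray(frames, dtype=float)
            return np.concatenate([frames.mean(0), frames.std(0), (frames ** 2).mean(0)])

        def add_example(self, label, frames):
            self.examples.setdefault(label, []).append(self.features(frames))

        def predict(self, frames):
            # Return the label whose centroid is closest to the incoming gesture.
            f = self.features(frames)
            return min(self.examples,
                       key=lambda lab: np.linalg.norm(np.mean(self.examples[lab], axis=0) - f))

    # Two demonstrations per class, then classification of a new movement.
    rng = np.random.default_rng(0)
    clf = FewShotGestureClassifier()
    for _ in range(2):
        clf.add_example("slow circle", rng.normal(0.0, 1.0, (100, 3)))
        clf.add_example("shake", rng.normal(0.0, 3.0, (100, 3)))
    print(clf.predict(rng.normal(0.0, 2.8, (100, 3))))  # expected: "shake"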

Specialist Areas

Interactive sound systems, human-machine interaction, motion capture, sound and gesture modeling, real-time sound analysis and synthesis, statistical modeling and interactive machine learning, sound signal processing, distributed interactive systems.

Team website

  • R-IoT: motion-capture board with 9 degrees of freedom and wireless transmission © Philippe Barbosa
  • Connected tennis rackets © Philippe Barbosa
  • MO - Modular Musical Objects © NoDesign.net
  • CoSiMa project © Philippe Barbosa
  • Siggraph installation, 2014 © DR

Collaborations

Atelier des Feuillantines, BEK (Norway), CNMAT Berkeley (United States), Cycling’74 (United States), ENSAD, ENSCI, GRAME, HKU (Netherlands), Hôpital Pitié-Salpêtrière, ICK Amsterdam (Netherlands), IEM (Austria), ISIR-CNRS Sorbonne Université, Little Heart Movement, Mogees (United Kingdom/Italy), No Design, Motion Bank (Germany), LPP-CNRS Université Paris-Descartes, Pompeu Fabra University (Spain), UserStudio, CRI-Paris Université Paris-Descartes, Goldsmiths University of London (United Kingdom), University of Geneva (Switzerland), LIMSI-CNRS Université Paris-Sud, LRI-CNRS Université Paris-Sud, Orbe.mobi, Plux (Portugal), ReacTable Systems (Spain), UCL (United Kingdom), Univers Sons/Ultimate Sound Bank, Universidad Carlos III Madrid (Spain), University of Genoa (Italy), McGill University (Canada), ZHdK (Switzerland).


Research topics and related projects

The Augmented Instruments

Acoustic instruments that have been fitted with sensors

European and national projects

DAFNE+

Decentralized platform for fair creative content distribution empowering creators and communities through new digital distribution models based on digital tokens

DOTS

Distributed Music Objects for Collective Interaction

Element

Stimulate Movement Learning in Human-Machine Interactions

MICA

Musical Improvisation and Collective Action

Aqua-Rius

Audio quality analysis for representing, indexing, and unifying signals



Software (design & development)

MuBu for Max (free)

MuBu (for "multi-buffer") is a set of modules for real-time multimodal signal processing (audio and movement), machine learning, and sound synthesis driven by descriptors. Using the multimodal MuBu container, users can store, edit, and visualize different types of temporally synchronized channels.
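
To make the multi-buffer idea above concrete, here is a minimal, hypothetical Python sketch of a container holding heterogeneous tracks (audio, motion, descriptors) sampled at different rates but addressed on one shared time axis. The class and method names are invented for illustration and are not MuBu's actual API.

    import numpy as np

    class Track:
        """One channel of time-ordered data (audio samples, motion frames,
        descriptors, markers...) with its own sample rate."""
        def __init__(self, data, sample_rate):
            self.data = np.asarray(data)
            self.sample_rate = float(sample_rate)

        def at(self, t):
            # Nearest-frame lookup at time t (in seconds).
            i = min(int(round(t * self.sample_rate)), len(self.data) - 1)
            return self.data[i]

    class MultiBuffer:
        """Container keeping heterogeneous tracks aligned on a common time axis."""
        def __init__(self):
            self.tracks = {}

        def add(self, name, data, sample_rate):
            self.tracks[name] = Track(data, sample_rate)

        def slice_at(self, t):
            # Cross-section of all tracks at time t: {track name: value}.
            return {name: track.at(t) for name, track in self.tracks.items()}

    # One second of audio at 44.1 kHz alongside 3-axis accelerometer data at 100 Hz.
    buf = MultiBuffer()
    buf.add("audio", np.zeros(44100), 44100)
    buf.add("accel", np.zeros((100, 3)), 100)
    print(buf.slice_at(0.5))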
Gesture & Sound (free)

Software modules for gesture-sound interaction: two Max objects that follow temporal morphologies based on Markov models. The VoiceFollower synchronizes sound and visual processes with a pre-recorded voice; the MotionFollower does the same with a pre-recorded movement.
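
The following Python sketch illustrates the general idea behind such followers: a left-to-right Markov chain whose states are the frames of a pre-recorded template, updated with a forward pass as new frames arrive. It is a deliberately simplified, hypothetical example, not the algorithm implemented by the VoiceFollower or MotionFollower objects; the parameters and signals are invented.

    import numpy as np

    def follow(template, stream, sigma=0.1, p_stay=0.5):
        """Left-to-right HMM forward pass: each frame of the pre-recorded
        template is a state; for each incoming frame, return the most
        probable position (estimated progress) within the template."""
        n = len(template)
        alpha = np.zeros(n)
        alpha[0] = 1.0
        positions = []
        for x in stream:
            # Transition: stay on the current frame or advance to the next one.
            advanced = np.zeros(n)
            advanced[1:] = alpha[:-1] * (1.0 - p_stay)
            alpha = alpha * p_stay + advanced
            # Observation: Gaussian likelihood of x under each template frame.
            alpha *= np.exp(-0.5 * ((template - x) / sigma) ** 2)
            alpha /= alpha.sum() + 1e-12
            positions.append(int(np.argmax(alpha)))
        return positions

    # A noisy, half-speed replay of the template is tracked frame by frame.
    template = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))
    stream = np.repeat(template, 2) + 0.05 * np.random.default_rng(1).normal(size=100)
    print(follow(template, stream)[::10])  # estimated template index, increasing over time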
CataRT Standalone (included in your membership)

Concatenative corpus-based synthesis uses a database of recorded sounds and a unit-selection algorithm that chooses segments from the database in order to synthesize, by concatenation, a musical sequence. The selection is based on characteristics of the recordings obtained by an analysis of the …
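
In code, the selection step described above can be reduced to a nearest-neighbour search in descriptor space. The Python sketch below is a deliberately simplified, hypothetical illustration (Euclidean target cost only, no concatenation cost, invented descriptor values); it is not CataRT's actual selection algorithm.

    import numpy as np

    def select_units(corpus_descriptors, target_descriptors):
        """Toy unit selection: for each target frame, return the index of the
        corpus segment whose descriptors are closest (Euclidean distance)."""
        corpus = np.asarray(corpus_descriptors, dtype=float)
        return [int(np.argmin(np.linalg.norm(corpus - np.asarray(t, dtype=float), axis=1)))
                for t in target_descriptors]

    # Descriptors per segment: (pitch in Hz, loudness in dB) -- invented values.
    corpus = [(220, -20), (440, -12), (660, -9), (880, -6)]
    target = [(430, -10), (870, -7), (225, -18)]
    print(select_units(corpus, target))  # -> [1, 3, 0]
    # The chosen segments would then be concatenated (e.g. with crossfades)
    # to render the target sequence.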

Team


Head Researcher: Frederic Bevilacqua
Researchers & Engineers: Benjamin Matuszewski, Diemo Schwarz, Riccardo Borghesi, Iseline Peyre
Doctoral Student: Victor Paredes
Trainee: Matéo Fayet
Sarah Nabi, Aliénor Golvet
Engineer: Coralie Vincent



Publications

Latest articles

News / Research

Launch: Post-Master’s Degree in Artificial Intelligence and Movement

MINES ParisTech, in partnership with IRCAM (France), the Center for Research and Technology Hellas (Greece), the Idiap Research Institute (Switzerland), and INREV at Paris 8 University (France), l…

News / Research

01'30 in Stravinsky with Pavlos Antoniadis

Pavlos Antoniadis defended his doctoral research, carried out in the Sound Music Movement Interaction team (STMS - CNRS/IRCAM/Sorbonne Université), last June: "GesTCom: A sensor-bas…

News / Research

Orbe is in the running for the 2018 edition of the Innovation Radar Prize

Launched by the European Commission in 2015, the Innovation Radar Prize competition identifies Europe's top innovators and their innovations found in projects supported by the European H2020 program…