Sound Music Movement Interaction

The Sound Music Movement Interaction team (formerly Real-Time Musical Interactions) conducts research and development on interactive systems dedicated to music and the performing arts.

Artistic research residency at Ircam: an experiment with 40 Raspberry Pi units connected and synchronized over the network in the Espace de projection

Our work addresses the entire interactive chain, from the capture and analysis of gestures and sounds to tools for interaction management, generation, and synchronization. It also includes techniques for real-time sound synthesis and processing. These research activities, and their associated software developments, are generally carried out within interdisciplinary projects bringing together scientists, artists, educators, and designers. They find applications in artistic creation projects, music education, movement learning, and in medical fields such as rehabilitation guided by sound and music.

Frederic Bevilacqua

Team Leader

Diemo Schwarz

Researcher

Jerome Nika

Researcher

Giorgia Cantisani

Visiting Researcher

Coralie Vincent

Engineer

Louise Grebel

PhD Student

Léo Mercier

PhD Student

  • Gesture and movement modeling and sensorimotor learning: this theme covers the development of behavioral and neural models grounded in experimental studies of sound–movement interaction, taking perceptual aspects into account, with cases ranging from instrumental performance to dance movement.
  • Interactive sound synthesis and processing: this theme mainly includes sound synthesis and processing methods based on recorded sounds or large sound corpora.
  • Gesture-based sound interactive systems and movement sonification: this theme concerns the design and development of interactive sound environments using gestures, movements, and touch. Interactive machine-learning methods are developed specifically for these uses. Movement sonification has applications in health, particularly in rehabilitation (Sound–Music–Health axis).
  • Composition and interaction design with musical processes: this theme concerns the design of interaction strategies with audio and/or musical synthesis, as well as the customization of this interaction by artists, notably through modeling from examples of relationships between control streams (audio or symbolic) and the sound material to be synthesized, using various machine-learning models.
  • Collective musical interaction and distributed systems: this theme addresses questions of musical interaction ranging from a few users to several hundred. It focuses in particular on the development of a web environment combining computers, microcomputers, smartphones and/or embedded systems, in order to explore new possibilities for expressive and synchronized interactions.
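As a toy illustration of the movement sonification theme above, the sketch below maps a normalized motion-energy value to a pitch and a gain that could drive a synthesizer. The function names and mapping choices are hypothetical illustrations, not the team's actual software.

```javascript
// Toy movement sonification: map a normalized motion-energy value
// (0 = still, 1 = maximal movement) to synthesis parameters.
// Hypothetical sketch; not the team's published sonification mappings.

// Exponential pitch mapping: pitch perception is roughly logarithmic,
// so we interpolate geometrically between a low and a high frequency.
function energyToFrequency(energy, fMin = 110, fMax = 880) {
  const e = Math.min(1, Math.max(0, energy)); // clamp to [0, 1]
  return fMin * Math.pow(fMax / fMin, e);     // geometric interpolation
}

// Simple amplitude mapping with a floor so that slow movement
// still produces an audible (quiet) sound rather than silence.
function energyToGain(energy, floor = 0.1) {
  const e = Math.min(1, Math.max(0, energy));
  return floor + (1 - floor) * e;
}
```

In a real interactive system these values would be smoothed over time and fed to audio parameters (e.g. an oscillator frequency and a gain node) rather than read once.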

Interactive sound systems, human–machine interaction, motion capture, sound and gesture modeling, real-time sound analysis and synthesis, statistical modeling and interactive machine learning, signal processing, distributed interactive systems, movement sonification.

Related Projects

OpenTuning

Creative individuation through interaction design with generative musical systems

Dates: March 2026 to December 2029

Inside Artificial Improvisation

Inside the black box of artificial improvisation

Dates: January 2026 to December 2029

Koral

Playing music collectively using smartphones

Dates: December 2023 to December 2024

Related Software

Software

Soundworks

Full-stack JavaScript framework for distributed WebAudio and multimedia applications.

Free software

Gestural interactions
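Distributed frameworks such as Soundworks need a shared notion of time across many clients; a classic building block for this is round-trip clock-offset estimation. The sketch below is a hypothetical NTP-style illustration of that technique, not the Soundworks API itself.

```javascript
// Hypothetical NTP-style clock-offset estimation, the classic scheme
// for synchronizing web clients to a shared server clock.
// Illustrative only; not the Soundworks synchronization code.

// One ping/pong yields four timestamps:
//   t0: client send, t1: server receive, t2: server send, t3: client receive.
function estimateOffset(t0, t1, t2, t3) {
  const roundTrip = (t3 - t0) - (t2 - t1);    // network time only
  const offset = ((t1 - t0) + (t2 - t3)) / 2; // server clock minus client clock
  return { offset, roundTrip };
}

// Over several pings, trust the measurement with the shortest round trip,
// since it suffered the least (and most symmetric) network delay.
function bestOffset(pings) {
  return pings
    .map(([t0, t1, t2, t3]) => estimateOffset(t0, t1, t2, t3))
    .reduce((best, cur) => (cur.roundTrip < best.roundTrip ? cur : best));
}
```

With an agreed offset, every device can schedule an event at the same server time, which is what makes synchronized playback across dozens of phones or Raspberry Pi units feasible.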

Software

Max Sound Box

Real-time Interaction Modules for Max.

Free software

Sound design and processing

Voice processing

Software

Gesture & Sound

Gesture–sound modules: VoiceFollower synchronizes sound and visuals with a pre-recorded voice; MotionFollower does the same with a pre-recorded movement.

Free software

Voice processing

Gestural interactions
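Following a live input against a pre-recorded reference, as VoiceFollower and MotionFollower do, is commonly approached with sequence-alignment techniques. The sketch below shows dynamic time warping over 1-D feature sequences as a hypothetical illustration of that family of methods, not the modules' actual algorithm.

```javascript
// Minimal dynamic time warping (DTW) between two 1-D feature sequences,
// a common building block for aligning a live performance against a
// pre-recorded reference. Illustrative only; not the shipped algorithm.
function dtwDistance(a, b) {
  // D[i][j] = cost of best alignment of a[0..i) with b[0..j)
  const D = Array.from({ length: a.length + 1 }, () =>
    new Array(b.length + 1).fill(Infinity)
  );
  D[0][0] = 0;
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = Math.abs(a[i - 1] - b[j - 1]);
      // extend the cheapest of: insertion, deletion, or match
      D[i][j] = cost + Math.min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1]);
    }
  }
  return D[a.length][b.length];
}
```

Because DTW tolerates local tempo differences, a slower or faster rendition of the same gesture still aligns to the reference with low cost.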

Under the supervision of:

Ircam
Sorbonne University
CNRS
Ministry of Culture