Sound Music Movement Interaction

The Sound Music Movement Interaction team (previously known as the Real-Time Musical Interactions team) carries out research and development on interactive systems dedicated to music and performances.

Our work covers all aspects of the interactive process, including the capture and multimodal analysis of the gestures and sounds produced by musicians, tools for synchronizing and managing interaction, and techniques for real-time sound synthesis and processing. This research and its associated software development are generally carried out within interdisciplinary projects involving scientists, artists, teachers, and designers, and find applications in creative projects, music education, movement learning, and the digital audio industry.

Major Themes

Modeling and Analysis of Sounds and Gestures
This theme covers theoretical work on the analysis of sound and gesture streams or, more generally, of multimodal temporal morphologies. It draws on diverse audio analysis techniques and on the study of the gestures of performing musicians and dancers.
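To make this concrete, the short Python sketch below (hypothetical data and function names, not the team's code) extracts two such temporal morphologies: a frame-wise loudness curve from an audio signal and a smoothed energy curve from a 3-axis accelerometer stream.

    import numpy as np

    def audio_envelope(signal, hop=512, win=1024):
        """Frame-wise RMS loudness curve of a mono audio signal."""
        frames = [signal[i:i + win] for i in range(0, len(signal) - win, hop)]
        return np.array([np.sqrt(np.mean(f ** 2)) for f in frames])

    def gesture_energy(acc_xyz, smooth=8):
        """Smoothed magnitude of a 3-axis accelerometer stream (N x 3)."""
        mag = np.linalg.norm(acc_xyz, axis=1)
        kernel = np.ones(smooth) / smooth
        return np.convolve(mag, kernel, mode="same")

    # Synthetic example: one second of noise at 44.1 kHz and 100 accelerometer frames.
    audio = np.random.randn(44100)
    acc = np.random.randn(100, 3)
    print(audio_envelope(audio).shape, gesture_energy(acc).shape)

Both functions return one-dimensional time series, which is the common ground on which sound and gesture can then be compared, segmented, or modeled jointly.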

Technologies for Multimodal Interaction
This theme concerns our tools for the analysis and multimodal recognition of movement and sound, as well as tools for synchronization (gesture following, for example) and visualization.
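As an illustration of the gesture-following principle, the sketch below (Python, a simplified illustration rather than the team's implementation) models a recorded template as a left-to-right Markov chain with one state per frame; a forward pass over the incoming stream then estimates, at every new frame, the current position within the template.

    import numpy as np

    def follow(template, stream, sigma=0.2, p_stay=0.4):
        """Yield the estimated template position after each observed frame."""
        n = len(template)
        alpha = np.zeros(n)
        alpha[0] = 1.0                    # start at the beginning of the gesture
        for obs in stream:
            # left-to-right transitions: stay in the current state or move one ahead
            pred = p_stay * alpha
            pred[1:] += (1.0 - p_stay) * alpha[:-1]
            # Gaussian observation likelihood of the incoming frame for every state
            lik = np.exp(-0.5 * ((obs - template) / sigma) ** 2)
            alpha = pred * lik
            alpha /= alpha.sum() + 1e-12  # normalize to keep the recursion stable
            yield int(np.argmax(alpha))   # most likely time position in the template

    template = np.sin(np.linspace(0, np.pi, 50))      # reference morphology
    performance = np.sin(np.linspace(0, np.pi, 80))   # same gesture, played slower
    positions = list(follow(template, performance))
    print(positions[0], positions[-1])                # the follower reaches the end

Because the chain may stay in a state, the estimated position still progresses to the end of the template when the performance is slower than the reference, which is what makes such following usable for synchronization.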

Interactive Sound Synthesis and Processing
This theme focuses mainly on sound synthesis and processing methods based on recorded sounds or large sound corpora.

Systems for Gesture Capture and Augmented Instruments
This theme covers the gestural interfaces and augmented instruments the team has developed for music and performance.

Specialist Areas

Interactivity, real-time computing, human-computer interaction, signal processing, motion capture, sound and gesture modeling, statistical modeling and machine learning, real-time sound analysis and synthesis.

Team Website

  • R-IoT: wireless motion-capture board with 9 degrees of freedom © Philippe Barbosa
  • Connected tennis rackets © Philippe Barbosa
  • MO - Modular Musical Objects © NoDesign.net
  • CoSiMa project © Philippe Barbosa
  • Siggraph installation, 2014 © DR


Research topics and related projects

Corpus-Based Concatenative Synthesis

Sound synthesis driven by a database of recorded sounds and a unit selection algorithm
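The unit selection step can be summarized as a weighted nearest-neighbor search in a descriptor space, as in the minimal Python sketch below (descriptor names and data are hypothetical placeholders, not the actual implementation).

    import numpy as np

    rng = np.random.default_rng(0)
    # Corpus of 1000 units, each described by (pitch, loudness, brightness), normalized.
    corpus_descriptors = rng.random((1000, 3))

    def select_units(target_descriptors, corpus, weights=(1.0, 1.0, 1.0)):
        """Return, for each target frame, the index of the nearest corpus unit."""
        w = np.asarray(weights)
        # weighted Euclidean distance between every target frame and every unit
        diff = (target_descriptors[:, None, :] - corpus[None, :, :]) * w
        dist = np.linalg.norm(diff, axis=2)
        return dist.argmin(axis=1)

    target = rng.random((20, 3))   # descriptor trajectory to be resynthesized
    indices = select_units(target, corpus_descriptors)
    print(indices)                 # units to concatenate, in playback order

The selected units are then concatenated (typically with crossfades) to produce the output sound; the weights let the user decide which descriptors matter most for the match.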

Gesture Analysis and Recognition

Study of instrumental gesture and its relationship with both musical writing and the characteristics of the sound signal

The Augmented Instruments

Acoustic instruments that have been fitted with sensors

European and national projects

CoSiMa

Collaborative Situated Media

EFFICAC(e)

Extended Frameworks For 'In-Time' Computer-Aided Composition

Element

Stimulate Movement Learning in Human-Machine Interactions

Gemme

Musical Gesture: Models and Experiments

Legos

Sensori-motor learning in gesture-based interactive sound systems

MICA

Musical Improvisation and Collective Action

MIM

Enhancing Motion Interaction through Music Performance

Music Bricks

Musical Building Blocks for Digital Makers and Content Creators

RapidMix

Real-time Adaptive Prototyping for Industrial Design of Multimodal Interactive eXpressive technology

Skat-VG

Sketching Audio Technologies using Vocalizations and Gestures

Wave

Web Audio: Editing/Visualization


Software (design & development)


MuBu for Max

Containers for sound and movement data, provided as a set of Max objects.

Gesture & Sound

Two Max objects for following temporal morphologies, based on Markov models.

CataRT Standalone

A real-time concatenative synthesis system that lets you play sound “grains”.

Team


Head Researcher: Frédéric Bevilacqua
Researchers & Engineers: Benjamin Matuszewski, Diemo Schwarz, Riccardo Borghesi
Doctoral Students: Hugo Scurto, Iseline Peyre, Marion Voillot

Collaborations

Atelier des Feuillantines, BEK (Norway), CNMAT Berkeley (United States), Cycling’74 (United States), ENSAD, ENSCI, GRAME, HKU (Netherlands), Hôpital Pitié-Salpêtrière, ICK Amsterdam (Netherlands), IEM (Austria), ISIR-CNRS Sorbonne Université, Little Heart Movement, Mogees (United Kingdom/Italy), No Design, Motion Bank (Germany), LPP-CNRS Université Paris-Descartes, Pompeu Fabra University (Spain), UserStudio, CRI-Paris Université Paris-Descartes, Goldsmiths University of London (United Kingdom), University of Geneva (Switzerland), LIMSI-CNRS Université Paris-Sud, LRI-CNRS Université Paris-Sud, Orbe.mobi, Plux (Portugal), ReacTable Systems (Spain), UCL (United Kingdom), Univers Sons/Ultimate Sound Bank, Universidad Carlos III Madrid (Spain), University of Genoa (Italy), McGill University (Canada), ZHdK (Switzerland).


Publications