- Research Teams
- Sound Systems and Signals: Audio/Acoustics, Instruments
- Acoustic and Cognitive Spaces
- Sound Perception and Design
- Sound Analysis-Synthesis
- Sound Music Movement Interaction
- Musical Representations
- Analysis of Musical Practices
Sound Music Movement Interaction
Our work covers all aspects of the interactive process, including the capture and analysis of gestures and sounds, tools for authoring interaction and synchronization, and techniques for real-time sound synthesis and processing. These research projects and their associated software (MuBu for Max, CataRT, Soundworks) are generally carried out within interdisciplinary projects involving scientists, artists, teachers, and designers, and find applications in creative projects, music education, movement learning, and medical domains such as physical rehabilitation guided by sound and music.
- Modeling and Analysis of Sounds and Gestures: this theme covers theoretical developments concerning the analysis of sound and gesture data or, more generally, multimodal temporal morphologies. It draws on diverse audio-analysis techniques and on the study of musicians' and dancers' gestures.
- Interactive Sound Synthesis and Processing: this theme focuses on synthesis and sound-processing methods based on recorded sounds or large sound collections (corpus-based concatenative synthesis).
- Interactive Sound Systems Based on Gesture and New Instruments: this theme focuses on the design and development of interactive sound environments driven by gestures, movements, and touch. Interactive machine learning is one of the tools developed in this framework.
- Collective Musical Interaction and Distributed Systems: this theme addresses musical interaction at scales ranging from a few users to hundreds. It concerns the development of a Web environment combining computers, smartphones, and/or embedded systems to explore new possibilities for expressive, synchronized interactions.
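The corpus-based concatenative approach mentioned in the second theme can be illustrated as nearest-neighbor grain selection in a descriptor space. The sketch below is a minimal, hypothetical illustration (two hand-picked descriptors, fixed grain size, no overlap-add), not the MuBu or CataRT implementation:

```python
# Minimal sketch of corpus-based concatenative synthesis.
# Hypothetical descriptor choice (RMS + spectral centroid); real systems
# such as CataRT use much richer feature sets and grain scheduling.
import numpy as np

SR = 16000     # sample rate (Hz)
GRAIN = 512    # grain length in samples

def descriptors(grain):
    """Per-grain features: RMS energy and spectral centroid (Hz)."""
    rms = np.sqrt(np.mean(grain ** 2))
    spec = np.abs(np.fft.rfft(grain))
    freqs = np.fft.rfftfreq(len(grain), 1 / SR)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    return np.array([rms, centroid])

def synthesize(target, corpus):
    """Rebuild `target` by concatenating, for each target grain, the
    corpus grain closest in descriptor space (Euclidean distance)."""
    grains = [corpus[i:i + GRAIN]
              for i in range(0, len(corpus) - GRAIN + 1, GRAIN)]
    feats = np.stack([descriptors(g) for g in grains])
    out = []
    for i in range(0, len(target) - GRAIN + 1, GRAIN):
        d = descriptors(target[i:i + GRAIN])
        best = np.argmin(np.linalg.norm(feats - d, axis=1))
        out.append(grains[best])
    return np.concatenate(out)

# Usage: resynthesize a 440 Hz tone from a corpus of three sine tones.
t = np.arange(SR) / SR
corpus = np.concatenate([np.sin(2 * np.pi * f * t) for f in (220, 440, 880)])
target = np.sin(2 * np.pi * 440 * t)
result = synthesize(target, corpus)
```

With matching descriptors, the selector should consistently pick grains from the 440 Hz region of the corpus; swapping in a noise corpus turns the same loop into a "mosaicing" effect.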
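The interactive-machine-learning idea in the third theme — training a recognizer from a handful of demonstrated examples — can be sketched as a 1-nearest-neighbour template matcher over resampled 2-D trajectories. This is a hypothetical toy (names `GestureRecognizer`, `resample` are invented here), not the team's actual models:

```python
# Toy interactive-machine-learning loop: demonstrate a few gestures,
# then classify new ones by nearest template. Hypothetical sketch only.
import numpy as np

def resample(path, n=16):
    """Resample a 2-D gesture trajectory to n points evenly spaced by arc length."""
    path = np.asarray(path, float)
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, dist[-1], n)
    return np.stack([np.interp(targets, dist, path[:, k]) for k in range(2)], axis=1)

class GestureRecognizer:
    """1-nearest-neighbour matcher: each demonstrated example becomes a
    template; a new gesture gets the label of the closest template."""
    def __init__(self):
        self.templates = []

    def add_example(self, label, path):
        self.templates.append((label, resample(path)))

    def classify(self, path):
        q = resample(path)
        return min(self.templates, key=lambda t: np.linalg.norm(t[1] - q))[0]

# Usage: two demonstrations, then a noisy query near the diagonal.
rec = GestureRecognizer()
rec.add_example("circle", [(np.cos(a), np.sin(a)) for a in np.linspace(0, 2 * np.pi, 50)])
rec.add_example("line", [(x, x) for x in np.linspace(0, 1, 50)])
print(rec.classify([(x, x + 0.05) for x in np.linspace(0, 1, 40)]))  # prints "line"
```

The appeal for musical interaction is the loop itself: a performer can add or replace examples on the fly and immediately hear how the mapping changes, which is what distinguishes interactive machine learning from offline training.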
Interactive sound systems, human-machine interaction, motion capture, sound and gesture modeling, real-time sound analysis and synthesis, statistical modeling and interactive machine learning, sound signal processing, distributed interactive systems.
Atelier des feuillantines, BEK (Norway), CNMAT Berkeley (United States), Cycling’74 (United States), ENSAD, ENSCI, GRAME, HKU (Netherlands), Hôpital Pitié-Salpêtrière, ICK Amsterdam (Netherlands), IEM (Austria), ISIR-CNRS Sorbonne Université, Little Heart Movement, Mogees (United Kingdom/Italy), No Design, Motion Bank (Germany), LPP-CNRS Université Paris-Descartes, Universitat Pompeu Fabra (Spain), UserStudio, CRI-Paris Université Paris-Descartes, Goldsmiths University of London (United Kingdom), Université de Genève (Switzerland), LIMSI-CNRS Université Paris-Sud, LRI-CNRS Université Paris-Sud, Orbe.mobi, Plux (Portugal), Reactable Systems (Spain), UCL (United Kingdom), Univers Sons/Ultimate Sound Bank, Universidad Carlos III Madrid (Spain), University of Genoa (Italy), McGill University (Canada), ZHdK (Switzerland).
Research topics and related projects
Acoustic instruments that have been fitted with sensors
European and national projects
Decentralized platform for fair creative-content distribution, empowering creators and communities through new digital distribution models based on digital tokens
Distributed Music Objects for Collective Interaction
Stimulate Movement Learning in Human-Machine Interactions
Musical Improvisation and Collective Action
Analysis of audio quality to represent, index, and unify signals