Gesture Analysis and Recognition
This research project responds to a growing interest in interactive musical systems based on gestural control. The applications concern not only music but also performing arts such as dance and theater. The research carried out within this project is multidisciplinary, spanning engineering, physiology and biomechanics, cognitive science, and artistic domains. This work is conducted in synergy with the team's developments on gestural interfaces.
Project Description & Goals
The first axis of research focuses on the study of instrumental gesture and its relationship to both musical writing and the characteristics of the sound signal. Measurements were taken on violinists and trumpet players in collaboration with McGill University. For string instruments, the movements of the bow, the violin, and the musician's entire body can be measured using optical 3D motion-capture techniques. The team has also developed complementary capture systems suitable for use on stage or in educational settings.
Together, these methods make it possible to measure and model musicians' gestures. Several issues are addressed in this study: motor control, the learning of expert gestures, the characterization of playing styles based on both sound and gesture parameters, and the modeling of gestural co-articulation phenomena analogous to those found in speech.
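As a concrete illustration of turning captured motion into gesture descriptors, the sketch below derives bow speed and acceleration from 3D marker positions by finite differences. It is only an assumed example: the frame rate, marker choice, and function names are hypothetical and do not describe the project's actual measurement pipeline.

# Hypothetical sketch: simple kinematic descriptors (bow speed and
# acceleration) from 3D motion-capture positions. Frame rate and marker
# layout are illustrative assumptions.
import numpy as np

FRAME_RATE = 250.0  # assumed capture rate, frames per second


def bow_kinematics(bow_tip: np.ndarray):
    """Compute speed and acceleration magnitude from (n_frames, 3) positions."""
    dt = 1.0 / FRAME_RATE
    velocity = np.gradient(bow_tip, dt, axis=0)            # per-axis velocity
    speed = np.linalg.norm(velocity, axis=1)               # scalar bow speed
    acceleration = np.linalg.norm(np.gradient(velocity, dt, axis=0), axis=1)
    return speed, acceleration


if __name__ == "__main__":
    # Synthetic bow-tip trajectory: a back-and-forth stroke along one axis.
    t = np.arange(0.0, 2.0, 1.0 / FRAME_RATE)
    positions = np.stack([300 * np.sin(2 * np.pi * 0.5 * t),
                          np.zeros_like(t), np.zeros_like(t)], axis=1)
    speed, accel = bow_kinematics(positions)
    print(f"peak bow speed: {speed.max():.1f} (position units per second)")

Descriptors of this kind (speed, acceleration, and similar time profiles) are the sort of gesture parameters that can then be related to playing style and to features of the sound signal.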
A second axis concerns the development of systems for gesture analysis and recognition. The team favors general approaches whose results can be applied to music, dance, or interactive installations. Such systems can be embedded in a live performance to synchronize and create interaction between the performers' movements and a broad range of sound or visual processes. Recognition is based on a variety of temporal characteristics of movement, captured either by video or by sensors attached to the performers' bodies. The tools developed during this research, such as the "gesture follower", make it possible to accurately recognize and characterize diverse high-level gestural elements defined by the artists.
IRCAM Team: Sound Music Movement Interaction.