
New Applications for Gesture/Sound Interaction

Interview with Frédéric Bevilacqua

Research director at IRCAM where he is also the head of the Sound Music Movement Interaction team, Frédéric Bevilacqua works on gestural interaction and movement analysis applied to music and live performance.

Known for his research on musical interfaces and the development of new digital musical instruments, for which he has received numerous innovation awards over the last decade, Frédéric Bevilacqua and his team have recently taken a new step forward by developing CoMo, an open-source web ecosystem that enables a new generation of easily accessible, multimodal, and collaborative tools. This opens up a whole new field of applications for research on sound interaction, in areas such as education and health, which were addressed in the ANR ELEMENT project that ended in 2022.

To learn more about the new challenges of sound interaction, we met the researcher in his office at the Sciences and Technologies of Music and Sound laboratory at IRCAM.

Frédéric, you specialize in interactive musical systems. Could you explain your research to us?

My team and I are interested in the possibilities of integrating gestures and the body into our interactions with computers, particularly in computer music. We propose new forms and innovative tools for listening to and playing music, taking the concepts of embodied interaction into account.

On the one hand, this includes the possibility of using tangible interfaces or motion sensors to trigger, transform, and explore sound spaces. From neuroscience studies, we know that perception and action are intimately linked; these methods of playing sounds and music can therefore be considered forms of active listening.

On the other hand, we are also interested in aspects of collective and social interaction, which are fully part of the concept of embodied interaction: our environment and context must be taken into account. The experience of music, from listening to performance, is gradually built through collective and social interaction.

In ten years, your team has gone from designing hardware prototypes called Modular Musical Objects (MO) to developing web platforms and online applications (CoMo). How do you explain this shift in direction?

The Modular Musical Objects (MO) had an important impact and have been used in many artistic creations in different forms and aesthetics. They also received international recognition, winning the Guthman Prize for new musical instruments and being included in the Talk to Me exhibition at MoMA in New York.


Examples of Modular Musical Objects © NoDesign

However, these new "digital objects", created in collaboration with NoDesign, were prototypes that were difficult to duplicate and maintain, given the technologies we were using a decade ago. Later, we took a step forward by partnering with PLUX to commercialize the RIoT-Bitalino motion sensors, initially created by Emmanuel Fléty at IRCAM. But an important turning point was probably the adoption of web technologies, initiated in the CoSiMa project, which allow us to connect a large number of devices, including our cell phones.

MOs were innovative musical instruments. But what does CoMo mean?

Our objective is to implement large-scale collective and collaborative interactions. The value of web technologies lies in their ability to provide standards that make it possible to connect many sensors and devices of all kinds, from nanocomputers to laptops, even when they are not directly connected to the Internet. The hardware and software infrastructure is designed for interactions with tens or hundreds of "objects" connected to a Wi-Fi network. We have thus evolved the Modular Musical Objects (MO) into "Collective Musical Objects", which we call CoMo.

CoMo is in fact a collection of applications that all share the same software base, developed by Benjamin Matuszewski with contributions from Jean-Philippe Lambert and Jules Françoise. These applications connect motion sensors and motion-analysis and recognition modules with sound synthesis.

  • The most generic application is CoMo-Elements which can be used with cell phones, taking advantage of the built-in motion sensors found in all smartphones. CoMo-Elements can associate sounds with postures and movements. It is commonly used for musical pieces and workshops, especially with dancers.
  • CoMo-Education, conceived by Marion Voillot (a designer and doctoral student in our team and at the CRI, Université Paris Cité), is a specific version intended for schools. This application lets you tell stories in motion, generating a sound universe that is "played" by the children.
  • CoMo-Rééducation is an application designed during Iseline Peyre's PhD thesis for a home-based self-rehabilitation program with interactive music for stroke patients.
  • Finally, CoMo-Vox is an application developed in partnership with Radio France for learning simple choral conducting gestures.
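
The idea behind associating sounds with postures, as in CoMo-Elements, can be sketched minimally. The following is an illustrative toy, not CoMo's actual pipeline: it assumes each reference posture is a single accelerometer reading and matches new input to the nearest recorded posture.

```python
import math

# Hypothetical reference postures: accelerometer (x, y, z) readings in g,
# each associated with a sound file. Labels and files are illustrative only.
POSTURES = {
    "arm_down": ((0.0, 0.0, 1.0), "drone.wav"),
    "arm_forward": ((0.0, 1.0, 0.0), "bell.wav"),
    "arm_side": ((1.0, 0.0, 0.0), "wind.wav"),
}

def classify(sample):
    """Return the sound associated with the nearest recorded posture."""
    name = min(POSTURES, key=lambda k: math.dist(POSTURES[k][0], sample))
    return POSTURES[name][1]

print(classify((0.1, 0.05, 0.9)))  # near "arm_down" -> drone.wav
```

A real system would of course classify over time windows of sensor data and crossfade between sounds rather than switch abruptly, but the mapping principle is the same.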

Photo: CoMo-Rééducation, a movement-sonification device for self-rehabilitation (Iseline Peyre's thesis). Photo credit © Mikael Chevallier

Your team has also expanded its artistic collaborations in recent years: interactive installations (Biotope by Jean-Luc Hervé, Square by Lorenzo Bianchi), collective performances (by Chloé, Michelle Agnes Magalhaes, Aki Ito, Garth Paine), as well as cultural outreach activities for young audiences (Maestro, Maesta at the Philharmonie des enfants, which lets children step into the shoes of a conductor). What lessons does science draw from these musical experiments?

This is an interesting question, because I don't think we can talk about science in the singular. We are really at the junction of several ways of conceiving scientific research, of knowledge that concerns technology but also human beings. We are truly at the crossroads of computer science, cognitive science, design, and the humanities and social sciences, all of which have different methodologies. We study, for example, how we can learn or relearn movements guided by sounds, and how to set up technical devices to implement this. But our devices are not just technical applications: we are interested in the questions of appropriation, usability, and creativity that they stimulate. Each situation implies different contexts that we have to take into account, and these are research fields in themselves. Moreover, we are increasingly interested in the ethical and sustainability issues involved in our research.


Composer Michelle Agnes Magalhaes, and the Ensemble soundinitiative, Constella(c)tions performance at IRCAM, 2019 © Valentin Boulay

Today, new fields of application of your research are opening up in the fields of education and health. Why have other sectors become interested in the interaction between gesture and sound?

Learning is shaped by our perceptions, actions, and emotions. Many educational methods suggest taking the body, movements, and collective interactions into account. Many of the scenarios we have imagined are used to learn to be attentive to others, to learn movements, or simply to develop listening skills. CoMo-Education, which lets you tell stories in movement and music, is a very good example, designed in collaboration with teachers. Concerning health, sounds and music can be used to create playful and motivating applications, which we see as one of the keys to home rehabilitation. But many other applications are possible; for example, we have been approached to work with people with disabilities.

The recently completed ANR ELEMENT project, carried out with LISN, the CNRS, and Université Paris-Saclay, promoted the development of more complex and expressive human-computer interactions in motion-based interfaces. Which fundamental research questions did the project address?

We continued our research on the notions of learning and proficiency with digital interfaces: how we can learn to master new interfaces, gestural interfaces in particular. The notion of intuitiveness is often invoked in the industrial world as a selling point, but it is not that simple to achieve. One of the fundamental questions that drives us is how to adapt each interface to the needs of its users. It is about understanding the various user communities, their needs and abilities, and designing interactions and tools that can accommodate different situations.

What were the scientific advances that it brought about? And its concrete applications?

On the one hand, we are developing the notion of "gesture and movement design", i.e. methodologies and tools for designing gestural interactions. This can be seen as a new discipline, similar to sound design, which emerged several years ago. We develop tools that allow users to define their own gestures according to the context and their abilities. All of the CoMo applications we have discussed are concrete applications from the ELEMENT project and are based on this approach. Our partners (LISN and ISIR) have also developed Marcelle, a new platform for machine learning that lets users build and test shape- and motion-recognition models collaboratively. Our philosophy is to provide a wide range of possibilities for users to appropriate, control, and modify the contents of machine learning systems.
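
The principle of letting users record their own gesture templates and then recognize them can be sketched as follows. This is a toy illustration using dynamic time warping (DTW) over one-dimensional sensor traces; the class and method names are hypothetical and do not reflect the Marcelle or CoMo APIs.

```python
# Toy template-based gesture recognizer: a user records each gesture as a
# sequence of sensor values; new input is matched to the closest template
# by dynamic time warping (DTW), which tolerates speed variations.

def dtw(a, b):
    """DTW distance between two 1-D sequences (classic dynamic program)."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

class GestureRecognizer:
    """Illustrative user-defined-gesture store and nearest-template matcher."""
    def __init__(self):
        self.templates = {}  # user-chosen label -> recorded trace

    def record(self, label, trace):
        self.templates[label] = list(trace)

    def recognize(self, trace):
        return min(self.templates, key=lambda k: dtw(self.templates[k], trace))

rec = GestureRecognizer()
rec.record("swipe", [0, 1, 2, 3, 4])
rec.record("shake", [0, 2, 0, 2, 0])
print(rec.recognize([0, 1, 1, 2, 3, 4]))  # matches "swipe"
```

Because users define the templates themselves, the vocabulary of gestures can be adapted to each person's context and abilities, which is the design principle the interview describes.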