As director of research at IRCAM and manager of the Sound Perception & Design team of the STMS lab, Nicolas Misdariis co-directed Claire Richards’ PhD thesis, during which she developed her multimodal harness, and is now supporting Alberto Gatti in his artistic research residency. In this episode, Misdariis tells us more about the stakes of this project, and shares his own expectations.
What role did you play in Claire Richards’ thesis work and the development of the multimodal harness?
Claire developed the first version of the harness as part of her PhD thesis, which I supervised jointly with Roland Cahen. The fact that we were both involved reflects the multidisciplinary nature of her research, located at the intersection of design, sound science and technology, and sound perception and design (which is the exact name of our research team). From the very start, we had considered potential industrial or artistic applications of this research. We welcomed Alberto Gatti into our team during the last year of Claire’s PhD, for our first artistic collaboration together.
What type of industrial applications were you thinking of?
We discussed the idea that this research could feed into the development of a human-machine interface (HMI). Together with composer and sound designer Andrea Cera, who had already worked with our research team on a project involving IRCAM and the car manufacturer Renault, we imagined a new way to distribute the information provided by a car’s on-board computer, for instance through the driver’s seat. This idea led to an HMI demonstrator combining sound and haptic technologies, installed in a “demo-car” called Symbioz (a prototype intended to showcase what the future of the car industry could look like), which went through extensive testing. The idea was to supplement the visual information provided by the on-board computer (concerning the state of the vehicle: indicators, parking sensors, seat-belt fastening; or its environment: traffic lights, speed limits, outside temperature) with audio-haptic information; in other words, a multimodal HMI. This innovation could make the HMI less intrusive for the other passengers, and some car manufacturers have already started to seriously contemplate it. In the case of autonomous driving, when the driver’s visual and auditory attention is already taken up by listening to music or watching a video, for instance, haptic technology could ensure that the driver still receives the necessary information. Another project is also being considered for the aviation industry…
Has the Sound Perception & Design team already been involved in industrial projects of that kind?
Yes, but they did not specifically involve multimodal technologies, except for Aurélie Frère’s research. Her thesis, conducted in collaboration with the car industry and defended at IRCAM in 2011, focused on the multimodal (auditory and vibratory) perception of a diesel engine and the impact of these disturbances on the driver, for purposes of diagnosis, mitigation and improvement.
Regarding applications that combine industrial and artistic purposes, could that include the film industry?
Absolutely! It’s already the case in a number of cinemas, where the seats are equipped with actuators and sensors, in what film distributors call “4D”. But that doesn’t work quite the same way our multimodal harness does, even if we could imagine using the harness as part of an audiovisual experience. We would then, however, be confronted with the difficulties inherent to film production: the audiovisual signal would need to be coupled with a vibratory channel along the entire production chain, from filming to post-production to the screening itself, for which the audience would have to be provided with specific equipment. It’s a cumbersome process that is hard to imagine implementing today.
As for potential artistic applications: what appealed to you about Gatti’s project?
We received many different proposals, but the selection committee voted for Alberto’s, both for its intrinsic qualities and because it would allow us to continue the work begun during Claire’s PhD. Alberto’s project also involves exploring instrumental gesture and its transmission through the multimodal device, a dimension that could feed into the work of the Sound Music Movement Interaction team. Gatti’s artistic research residency is therefore co-directed and co-supported by Frédéric Bevilacqua and me.
What are your expectations, in terms of scientific breakthrough?
By the end of Claire’s PhD, Alberto’s compositional exploration of the device had led to three proposals. It was very promising work, both from an artistic and a scientific standpoint, because it would help us test the viability of the digital interface we were in the process of developing. Claire’s research already included the development of an interface, because without a tool dedicated to monitoring, conceiving, composing and creating, the harness would have been completely unusable. By conducting his (artistic) experiments, Alberto will contribute a great deal to this project.
I am also curious to see what will come of another of Alberto’s great ideas: exploring the communication between musician and listener. In the last months before Claire’s thesis defense, we had already tried transferring the sound of a cello, recorded with a contact microphone, through the harness. The result was striking! You can really “feel” the instrument’s body with your own.
Interview by Jérémie Szpirglas