MIM

Enhancing Motion Interaction through Music Performance

This project focuses on movement-based human-computer interaction, taking a multidisciplinary approach that combines experimental psychology, music technology, and computational modeling. Initially, the project will investigate sensorimotor learning mechanisms and expressive control in human movement.

Project Description & Goals

Computational models of these mechanisms will be developed from experimental data gathered on performers' movements. These models will then be applied to the domain of Digital Musical Instruments (DMIs) to create new types of instruments built on sensorimotor learning mechanisms. The project contributes to two largely unexplored research areas: first, it advances the fundamental understanding of sensorimotor learning processes by studying complex human motion, such as the movements of musicians; second, it develops and evaluates novel interactive musical systems based on computational models of expressive musical gestures.

Project details

Program: H2020
Program type: Marie Sklodowska-Curie
Beginning: Jan. 1, 2016
End: Dec. 31, 2018
Status:

Participants

Project lead organization
  • Ircam
  • McGill