Brice Gatinet is a French composer currently living in Montreal. Along his musical path, he has discovered many ways to express unique musical ideas, infusing his works with elements of jazz, improvisation, death metal and classical music. These influences are at the heart of his writing and musical thought, where technique, poetry and structure are intimately linked to create a personal expressive dynamic. In France, Gatinet studied musicology at Grenoble University, as well as Jazz and Musique Actuelle at the Chambéry conservatory. Since moving to Montreal, he has obtained a Master's in Composition at the Université de Montréal, and he is currently completing a doctorate in Music Composition at McGill University under the direction of Philippe Leroux. In 2016, he received a funded three-month residency at the Casa de Velázquez in Madrid, and for the coming year he has already been commissioned by the Orchestre Symphonique de Montréal and le Nouvel Ensemble Moderne, among others.
2019.20 Artistic Research Residency
Development of software components using participatory design to generate musical materials with Artificial Intelligence.
In collaboration with the Musical Representations IRCAM-STMS Team.
During my residency, I will create a large-scale piece for piano, electric guitar, ensemble and electronics using software tools designed especially for this project by the ACIDS team and myself. These tools will focus mainly on human/AI creative partnership through improvisation and machine learning. My goals for this residency are to develop novel modes of expression, provide exploratory tools for musicians, and create companion systems based on Artificial Intelligence to support human creativity.
This project uses a participatory design method to explore and augment research already taking place within different axes of the ACIDS team. This team, based at IRCAM, specializes in Artificial Creative Intelligence and Data Inference. In my work, I intend to engage directly with research oriented around several approaches, including learning-based inductive orchestration, orchestral waveform generation, and co-improvisation and learning. The outcome will be used to create an orchestration based on a piano score, using real-time improvisation from the computer during the live performance to generate a tape part drawn from a specific dataset of piano and electric guitar sounds. These results will be harnessed for the creation of a large-scale piece for piano, electric guitar, ensemble and electronics. Broadly speaking, my work will be realized in three phases: 1) an analysis of different ongoing projects being undertaken by the ACIDS team in order to ascertain the needs and expectations for the software to be developed during my residency at IRCAM, 2) the conceptualization and implementation of specific software-tool prototypes, and 3) a post-creation evaluation to measure the usability of the resulting product.