Digital cultures increasingly interweave human creativity with the autonomous computational capabilities of surrounding environments, shaping joint human-machine action into new forms of shared reality involving "symbiotic interactions" found in the arts and, more generally, in almost every human endeavor. Co-creativity between humans and machines will bring about the emergence of distributed information structures, creating new performative situations with mixed artificial and human agents. This will disrupt known cultural orders and significantly impact human development. Generative learning of symbolic representations from physical and human signals, together with an understanding of the artistic and social strategies of improvisation, will help us better comprehend the dynamics of cooperation (or conflict) inherent in cyber-human bundles.
To this end, the REACH project aims at understanding, modeling, and developing musical co-creativity between humans and machines through improvised interactions, allowing musicians of any level of training to develop their skills and expand their individual and social creative potential. Improvisation is at the very heart of all human interaction, and music is a fertile ground for developing models and tools of creativity that can be generalized to other activities: in music, the constraints that drive cooperative behaviors into highly integrated courses of action are among the strongest. REACH will study shared musicianship at the intersection of the physical, human, and digital spheres as an archetype of distributed (natural/artificial) intelligence, and will produce models and tools to better understand and foster human creativity in a context where it is increasingly intertwined with computation.
REACH is based on the hypothesis that co-creativity in cyber-human systems results from an emergence of coherent behaviors and non-linear regimes of event and structure formation, leading to a rich co-evolution of musical forms. These phenomena arise from cross-learning processes between agents, involving feedback loops and complex reinforcement mechanisms. REACH will study these mechanisms in vivo and in vitro, and will produce creative tools through the convergence of methods from interactive computational creativity, artificial intelligence and machine learning, the social sciences through the anthropology of improvised practices (in collaboration with CAMS at EHESS), and instrumental mixed-reality systems (in collaboration with the company HyVibe).
IRCAM Team: Musical Representations