The Greek journalist Thanos Madzanas met with Gérard Assayag, General Chair of the event, in the midst of preparations. A researcher at IRCAM and head of the institute’s Musical Representations team, Assayag explains the new and emerging links between music, improvisation, and artificial intelligence. What are improvised interactions and human-machine co-creativity? What are their impacts on musical creation and, more generally, on human experience?
What made IRCAM start IMPROTECH in the first place? Was it mainly for scientific, educational, or musical reasons, or all of those together?
IRCAM is a unique place for research and creation in music, and ImproTech is the right mixture of both: it brings together scholars, researchers, musicians, and makers around the idea of musical improvisation in interaction with digital intelligence. It creates a link connecting European and worldwide research with creative scenes through the symbolic pairing of Paris—where the event originated at IRCAM—with another city emblematic for its cultural influence.
How do you decide which city you will collaborate with each time? What made you choose New York, Philadelphia, and then Athens as the first European city where IMPROTECH will be held?
The opportunities for collaboration: New York University, Columbia University, the University of Pennsylvania, the University of Athens, and Onassis STEGI are all major cultural or educational centers that have expressed interest in our project and have offered infrastructure and support. We are proud that Athens is, in effect, the first European city to enter the series. It is a beacon in human history, and Greek culture has had a fundamental impact on the relationship between art and rationality.
IRCAM and the Onassis Cultural Center have a long-standing collaboration on many levels; they are even both part of the same European network. But what made you choose the Department of Musical Studies of the University of Athens (UOA) as the other co-organizer of Athens IMPROTECH?
There are also long-standing collaborations between IRCAM and UOA, in particular through Anastasia Georgakis, head of the Laboratory of Music Acoustics and Recording, who was a PhD researcher at IRCAM. This cooperation has resulted in joint workshops, teaching, student exchanges, and now ImproTech.
Onassis Cultural Center, Athens
IRCAM has always been at the forefront of research, particularly research related to technology, concerning both the possible forms of music and its sound. So I guess you have been interested in, and working on, artificial intelligence and how it can be applied to music for quite a long time?
There are many ongoing projects related to AI at IRCAM, and the STMS lab (IRCAM’s research lab, a national joint research unit operated by IRCAM, CNRS, and Sorbonne Université) has been a pioneer in machine musicianship, that is, artificial creative intelligence for music understanding and creation. AI in music is even one of the themes of ImproTech in Athens, with a lecture session on Friday, September 27 on "Algorithms, AI and Improvisation".
How, and to what degree, do you think AI can affect both the concept and the process of improvisation? Beyond the scientific and perhaps musicological interest, do you think it has any real musical value? Does it really help the creative process and the actual making of music?
My team—Musical Representations at IRCAM—and its partners at EHESS and UCSD have come up with the concept of human-machine co-creativity, which in a way answers that question. Co-creativity is an emergent phenomenon that appears when complex cross-feedback loops of learning and generation occur between natural and artificial agents. As a complex system, its properties are not reducible to the properties of the agents in interaction.
In that sense, there is no longer any point in wondering about the difficult philosophical question of whether artificial creativity is really possible, as what matters is how creative effects emerge from complex interactions involving humans as well. AI is of great help in that regard, as it provides methods for artificial listening, structure discovery, and generative learning, but smart interaction architectures also have to be devised in order for co-creativity to manifest itself.
More generally speaking, do you believe that AI can not only make the making of music easier and more efficient, but also enhance it, and even the whole musical experience as we create and perceive it as human beings? If so, in what ways and to what extent?
Co-creativity between humans and machines will bring about the emergence of distributed information structures of a new kind that will profoundly impact individual and collective human development. With technologies allowing one to extract semantic features from physical and human signals, combined with generative learning of high-level representations, we are beginning to unveil the increasing complexity of the cooperation, synergy, and conflict inherent to cyber-human networks. By understanding, modeling, and developing co-creativity between humans and machines in the musical domain, improvised interactions will allow musicians of any level of training to develop their skills and expand their individual and social creative potential.
Using computers—just like other electronic sound sources—as tools is one thing, but with AI, computers stop being tools and begin to be creators of music themselves, in their own right. Is that something that humanity, at the end of the second decade of the twenty-first century, really wants or not?
As I said, we are beyond the question of artificial creativity, with a much more powerful and humane concept of co-creativity. Nothing makes sense without humans, and so we’re not in some sort of transhumanist dream (or nightmare).
Would a piece of music created and made entirely by AI be exactly the same to you as one composed by a human being? Would it hold the same artistic and aesthetic value, and could it ultimately be as good (or as bad, of course) as one created by a human?
By definition if it were the same, it would be the same.
But in art in general, there is not only the crafting of the artefact; there is also the way it is dropped into the social and anthropological context by the artist, which is part of its historical impact. A machine might be able to design Marcel Duchamp’s urinal, but would it be smart enough to create and manipulate the provocative context in which the actual work took on its disruptive meaning in the history of art? Duchamp’s genius was not displayed in the mere form or materiality of the object.
Both personally and as a member of the IRCAM team, are you afraid that at some point in the future computers and AI will totally replace humans as the creators of music? Do you think it is really possible that one or more algorithms will compose music without any human intervention, but under commission or even very strict guidelines from someone who would benefit from controlling human creativity and, in the end, even humans themselves?
It is absolutely possible for bad, utilitarian commercial music, but how much worse would it really be compared to what we are exposed to now in mainstream media? There will always be creators of original music, even if they are in the minority, and creators are never afraid of integrating novel technologies.
Which would you say are the most important and interesting of all the scientific and educational contributions to Athens IMPROTECH, and why?
They are all fantastic!
But to get a quick idea of the field, the keynote presentations are a must.
And likewise, which of the performances do you consider the most interesting, and which are you perhaps looking forward to attending yourself?
The three concerts will be free, each evening offering a fantastic collection of artists from all around, with the right balance of young artists and world-class headliners; none should be missed.
Interview by Thanos Madzanas, Athens-based music writer, music critic, journalist, and novelist.
It was first published, translated into Greek, in the Blogs section of HuffPost.gr as part of a longer piece about Athens IMPROTECH ’19.