OMax

OMax combines real-time interaction (e.g. Max) with high-level musical representations (e.g. OpenMusic) to create a 'clone' based on a musician's live performance. Using techniques from machine learning and formal languages, OMax learns in an unsupervised manner from a MIDI stream, or even from an audio stream, produced by the musician.

The underlying process behind this interaction can be called "stylistic re-injection". The musician is constantly informed by several sources that create a complex feedback loop: he hears himself play and hears the others as they play, while memorizing the sound images that drift from the present toward the past. From mid- to long-term memory, these motifs, combined with even older images (repertoire, prior learning), can return after several transformations, one of the most common in improvisation being recombination.

In some ways, OMax models this kind of memory process and reifies it, making it audible. It re-injects musical figures from the past (immediate or distant) performance as reconstructions that are both similar and innovative, providing the musician with stimuli that are at once familiar and provocative.

Experienced improvisers generally react to their "clone" creatively and, as experience has shown, with particular interest in a "musical subject" that both inspires and unsettles their habitual playing style by pushing them to adapt to a new situation.

Top-level improvisers (e.g. Bernard Lubat and Mike Garson, taught by Bill Evans) have tested OMax and have provided valuable feedback that made it possible to settle the necessary heuristics, which go well beyond the formal model. These sessions can be seen at: repmus.ircam.fr/omax/home

Technically speaking, a Max component "listens" to the musician, extracts high-level descriptors, segments the events, and provides a continuous stream of this information to an OpenMusic component. The OpenMusic component incrementally constructs the model while continuously generating "improvisations" within a concurrent architecture. A management interface makes it possible to adjust the parameters for browsing the session's memory, from the immediate to the distant past, and even, when stored archives are used, a collective past involving other situations and other musicians; this last use gives rise to phenomena of hybridization.
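
The sketch below is a minimal, hypothetical illustration of this listen/model/generate loop, written in Python for readability. It is not the actual OMax implementation (which couples Max and OpenMusic and uses its own sequence model); the class and function names are invented, and the model is reduced to a simple table of continuations learned incrementally from segmented events, from which recombined "improvisations" are generated.

```python
import random
from collections import defaultdict

class IncrementalModel:
    """Hypothetical stand-in for the OpenMusic-side model: it learns
    continuations of short contexts from an incoming event stream."""
    def __init__(self, context_len=2):
        self.context_len = context_len
        self.continuations = defaultdict(list)  # context -> observed next events
        self.history = []

    def feed(self, event):
        """Incrementally update the model with one segmented event
        (what the Max 'listener' would deliver in the real system)."""
        self.history.append(event)
        if len(self.history) > self.context_len:
            ctx = tuple(self.history[-self.context_len - 1:-1])
            self.continuations[ctx].append(event)

    def generate(self, length=16):
        """Re-inject past material: walk the learned continuations,
        recombining figures from the memorized performance."""
        if len(self.history) <= self.context_len:
            return []
        out = list(self.history[:self.context_len])
        for _ in range(length):
            ctx = tuple(out[-self.context_len:])
            choices = self.continuations.get(ctx)
            if not choices:                      # unseen context: jump elsewhere in memory
                ctx = random.choice(list(self.continuations))
                choices = self.continuations[ctx]
            out.append(random.choice(choices))
        return out

# Example: MIDI pitches standing in for segmented high-level events.
model = IncrementalModel(context_len=2)
for pitch in [60, 62, 64, 62, 60, 67, 65, 64, 62, 60]:
    model.feed(pitch)             # events arrive one by one, as in a live session
print(model.generate(length=12))  # a recombined "improvisation" on the material
```

In the real system, learning and generation run concurrently and the browsing parameters (how far back in memory to draw from) are adjusted live from the management interface; this sketch only shows the incremental-learning and recombination idea in sequence.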

Partners: EHESS, UCSD (San Diego, USA), Cnsmdp

Participants

Ircam Teams: Musical Representations
