Automatic Music Indexing

Automatic extraction of musical descriptors for a piece of music

During projects presented hereinafter, the following subjects were addressed:

  • Methods for the automatic extraction of musical descriptors from a piece of music, such as the tempo, the location of beats, the meter, the tonality, or a temporal segmentation into chords. These descriptors facilitate the automatic classification of a piece and can be used for content-based searches in sound databases.
  • Musical excerpt recognition methods, designed to automatically identify excerpts from pieces of music against reference databases. These methods are based on a compact sound signature (fingerprint) that encodes the essential information; the algorithms compare each fragment of the sound under investigation with those in the database.
  • Methods for estimating the temporal structure of a piece of music, i.e. the repetition of its sections, enabling browsing within the temporal structure of the given musical piece.
  • Methods for the automatic creation of audio summaries, making it possible to quickly preview the contents of a given musical piece via its key passages.
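One common approach to tempo extraction, in the spirit of the first bullet, is to autocorrelate an onset-strength envelope and pick the beat period with the strongest periodicity. The sketch below is a minimal, hypothetical illustration of that idea, not IRCAM's actual algorithm; the onset envelope and frame rate are assumed inputs.

```python
def estimate_tempo(onset_env, frame_rate, bpm_min=60.0, bpm_max=180.0):
    """Estimate tempo (BPM) from an onset-strength envelope.

    Minimal sketch: score each candidate beat period (lag) by the
    autocorrelation of the envelope, and return the tempo whose lag
    scores highest. `onset_env` is a list of per-frame onset strengths,
    `frame_rate` the number of frames per second (both hypothetical).
    """
    n = len(onset_env)
    lag_min = int(frame_rate * 60.0 / bpm_max)   # shortest period considered
    lag_max = int(frame_rate * 60.0 / bpm_min)   # longest period considered
    best_bpm, best_score = None, float("-inf")
    for lag in range(max(1, lag_min), min(lag_max, n - 1) + 1):
        # Autocorrelation at this lag: strong if onsets recur every `lag` frames.
        score = sum(onset_env[i] * onset_env[i - lag] for i in range(lag, n))
        if score > best_score:
            best_score, best_bpm = score, 60.0 * frame_rate / lag
    return best_bpm
```

For example, an envelope with a pulse every 0.5 s at a 100 Hz frame rate should yield roughly 120 BPM.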
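The fingerprint-based recognition described above can be pictured as an inverted index from fingerprint hashes to tracks, with identification by vote counting. This is a toy sketch under the assumption that each fingerprint is already reduced to a set of integer hashes; the real encoding used at IRCAM is not specified here.

```python
from collections import defaultdict

def build_index(references):
    """Inverted index: hash value -> set of track ids.

    `references` maps a track id to its set of fingerprint hashes
    (a hypothetical, pre-computed representation).
    """
    index = defaultdict(set)
    for track_id, hashes in references.items():
        for h in hashes:
            index[h].add(track_id)
    return index

def identify(query_hashes, index):
    """Return the track sharing the most hashes with the query fragment."""
    votes = defaultdict(int)
    for h in query_hashes:
        for track_id in index.get(h, ()):
            votes[track_id] += 1
    return max(votes, key=votes.get) if votes else None
```

A query fragment is matched by looking up each of its hashes and returning the reference track that accumulates the most votes.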
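Temporal-structure estimation of the kind mentioned in the third bullet is often based on a self-similarity matrix: repeated sections appear as high-similarity off-diagonal stripes. Below is a minimal sketch assuming the piece has already been reduced to per-frame feature vectors (e.g. chroma); it only computes the matrix, leaving stripe detection aside.

```python
import math

def self_similarity(features):
    """Cosine self-similarity matrix of a sequence of feature vectors.

    `features` is a list of equal-length numeric vectors (a hypothetical
    per-frame representation such as chroma). Entry [i][j] is high when
    frames i and j are similar, so repetitions show as off-diagonal stripes.
    """
    def cosine(a, b):
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        if na == 0.0 or nb == 0.0:
            return 0.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)

    return [[cosine(a, b) for b in features] for a in features]
```

Frames with identical features score 1.0 against each other, orthogonal frames score 0.0.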

IRCAM team: Sound Analysis & Synthesis team.
