Automatic Music Indexing

During the SemanticHIFI, MusicDiscover, Ecoute and Quaero projects, the following subjects were addressed:
- Methods for the automatic extraction of musical descriptors from a piece of music, such as its tempo, beat locations, metrical structure, tonality, or the chord sequence over the course of the work. These descriptors facilitate the automatic classification of a piece and can be used for content-based searches in sound databases (a descriptor-extraction sketch follows this list).
- Musical excerpt recognition methods, designed to automatically identify excerpts from pieces of music using reference databases. These methods are based on a compact sound signature (fingerprint) encoding the essential information; each fragment of the sound under investigation is compared with the signatures in the database (see the fingerprinting sketch after this list).
- Methods for estimating the temporal structure of a piece of music in terms of repeated sections, enabling browsing within the temporal structure of the given piece (see the self-similarity sketch after this list).
- Methods for the automatic creation of audio summaries, making it possible to quickly pre-listen to the contents of a given piece of music via its key moments; the same sketch also illustrates a naive summary cue.
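To make the descriptor extraction concrete, here is a minimal sketch built on the open-source librosa library, used purely as a stand-in for the project's own analysis engines (which are not detailed in this text). It estimates tempo, beat positions, a beat-synchronous chroma sequence as a coarse proxy for the chord content, and a crude global tonic; every function and parameter choice here is an assumption for illustration.

```python
# Illustrative only: librosa stands in for the project's own descriptor
# extractors; function and parameter choices below are assumptions.
import numpy as np
import librosa

def extract_descriptors(path):
    """Estimate tempo, beat locations, beat-synchronous chroma, and a rough tonic."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    # Global tempo (BPM) and beat positions, converted from frames to seconds.
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)

    # Beat-synchronous chroma: a coarse proxy for the chord sequence over time.
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)
    beat_chroma = librosa.util.sync(chroma, beat_frames, aggregate=np.median)

    # Crude tonality hint: the strongest average pitch class.
    profile = chroma.mean(axis=1)
    tonic = librosa.midi_to_note(60 + int(profile.argmax()), octave=False)

    return {"tempo": float(np.squeeze(tempo)), "beat_times": beat_times,
            "beat_chroma": beat_chroma, "tonic_estimate": tonic}
```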
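The fingerprinting idea can be illustrated with a toy landmark-style scheme: pairs of prominent spectral peaks are hashed into compact (hash, time) entries, and a query is matched by voting for the time offset with the most consistent hash agreements. This is only a sketch of the general technique under assumed parameters, not the project's actual signature format.

```python
# Toy landmark-style fingerprinting: compact hashes plus offset voting.
# Names and parameters are assumptions made for illustration.
import numpy as np
import librosa

def fingerprint(y, sr, n_fft=2048, hop=512, peaks_per_frame=5, fan_out=5):
    """Hash pairs of prominent spectral peaks into (hash, frame) entries."""
    S = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop))
    peaks = []
    for t in range(S.shape[1]):
        # Keep the strongest bins of each frame as crude spectral peaks.
        for f in np.argsort(S[:, t])[-peaks_per_frame:]:
            peaks.append((int(f), t))
    hashes = []
    for i, (f1, t1) in enumerate(peaks):
        for f2, t2 in peaks[i + 1:i + 1 + fan_out]:
            dt = t2 - t1
            if 0 < dt <= 64:                       # pair only nearby peaks
                hashes.append(((f1, f2, dt), t1))  # compact signature entry
    return hashes

def best_match_score(query_hashes, ref_index):
    """Vote for the time offset that best aligns the query with a reference."""
    votes = {}
    for h, tq in query_hashes:
        for tr in ref_index.get(h, []):
            votes[tr - tq] = votes.get(tr - tq, 0) + 1
    return max(votes.values()) if votes else 0
```

A reference index would simply map each hash to the frames at which it occurs in each reference track; the track accumulating the most votes for a single time offset is reported as the identification.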
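Structure estimation and audio summarization commonly start from a self-similarity matrix, in which repeated sections appear as off-diagonal stripes. The sketch below, again using librosa only for illustration, computes a beat-synchronous chroma self-similarity matrix and picks the segment most similar to the rest of the piece as a naive summary cue; this is an assumed simplification, not the project's summarization method.

```python
# Sketch of structure analysis via a beat-synchronous self-similarity matrix.
# The "summary" cue picked here is a naive stand-in, assumed for illustration.
import numpy as np
import librosa

def repetition_structure(path):
    """Return a chroma self-similarity matrix and a representative time (s)."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    _, beats = librosa.beat.beat_track(y=y, sr=sr)

    # Beat-synchronous chroma: one column per inter-beat segment.
    chroma = librosa.util.sync(librosa.feature.chroma_cqt(y=y, sr=sr),
                               beats, aggregate=np.median)

    # Cosine self-similarity between every pair of segments;
    # repeated sections show up as bright off-diagonal stripes.
    C = librosa.util.normalize(chroma, norm=2, axis=0)
    ssm = C.T @ C

    # Naive summary cue: the segment most similar to the whole piece,
    # i.e. the moment whose material recurs the most.
    rep = int(ssm.sum(axis=1).argmax())
    bounds = np.concatenate(([0], beats))          # segment start frames
    return ssm, float(librosa.frames_to_time(bounds[rep], sr=sr))
```

A full structure analysis would go further (stripe detection, section clustering, labelling), but the self-similarity matrix is the common starting point for both browsing and summarization.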
Participants
Ircam Teams: Sound Analysis-Synthesis