
Topophonie
Topophonies are navigable virtual sound spaces composed of sounding or audio-graphic objects. Graphic and sounding shapes or objects are called audio-graphic when their visual and audio modalities are synchronized. In virtual reality and video games, we know how to build scenes composed of point-shaped audio-graphic elements (e.g. a single spot representing an object). However, no existing tool supports navigation through scenes made up of very large numbers of interactive visual and sound elements, or of dispersed elements such as a crowd, a flow of traffic, foliage, or rain.
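To make the notion of an audio-graphic object concrete, the following minimal Python sketch couples a point-shaped visual element with a sound grain triggered on the same clock. The class, field names, and event format are hypothetical illustrations of the synchronization idea, not part of the project's actual model.

```python
from dataclasses import dataclass


@dataclass
class AudioGraphicObject:
    """A point-shaped element whose visual and audio events share one timestamp."""
    x: float
    y: float
    z: float
    grain_sample: str  # identifier of the sound grain to play (hypothetical)

    def trigger(self, time_s: float):
        """Emit a visual event and an audio grain synchronized at the same time."""
        visual_event = {"type": "flash", "pos": (self.x, self.y, self.z), "t": time_s}
        audio_event = {"type": "grain", "sample": self.grain_sample, "t": time_s}
        return visual_event, audio_event


if __name__ == "__main__":
    drop = AudioGraphicObject(0.5, 2.0, -1.0, grain_sample="raindrop_01")
    print(drop.trigger(time_s=0.25))
```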
The Topophonie research project proposes lines of research and innovative developments for sound and visual navigation in spaces composed of multiple, disseminated sound and visual elements. Working as a multidisciplinary scientific group (digital audio, visualization, sound design) together with companies specialized in interactive multimedia, the Topophonie project will design and develop models, interfaces, and audio-graphic renderings of groups of granular, animated, and spatialized objects. The project team brings together researchers specialized in granular sound rendering and in advanced interactive graphic rendering, as well as digital designers and companies specialized in the relevant fields of application.
The research will take place in three phases: a first phase of study and experimentation to define generic models and build prototypes based on demonstration models; a second phase to design the components needed to implement the applications; and a third phase in which those applications are developed. The completed work will deliver interfaces for defining and controlling multimedia scenes, together with tools for real-time rendering on synchronized audio and visual channels. The scientific interest lies in defining a generic data model, building an interface for the efficient and fine-grained definition of large volumes of non-homogeneous audio and graphic objects, and achieving smooth, interactive rendering of moving objects. These developments will be validated on three scenarios (scenes) with varying degrees of detail and behavior, for example detailed rainfall, movement through foliage, and moving through and over a crowd of people.
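As a toy illustration of the rainfall scenario, the sketch below simulates a large set of dispersed drops and keeps only the audio-graphic impacts closest to the listener under a fixed per-frame grain budget, so that the synchronized audio and visual channels stay within a bounded rendering cost. All class names, parameters, and the budgeting strategy are illustrative assumptions, not the project's actual model or interface.

```python
import random


class RainTopophony:
    """Toy scene of many dispersed audio-graphic drops, sketching the idea of
    rendering a large, non-homogeneous set of elements under a per-frame grain budget.
    Names and parameters are illustrative assumptions, not the project's API."""

    def __init__(self, n_drops=10_000, grain_budget=64, seed=0):
        rng = random.Random(seed)
        # Each drop: [x, y, z] with x/z on the ground plane and y = current height.
        self.drops = [[rng.uniform(-50, 50), rng.uniform(0, 20), rng.uniform(-50, 50)]
                      for _ in range(n_drops)]
        self.grain_budget = grain_budget   # max synchronized audio grains per frame
        self.rng = rng

    def step(self, dt=1 / 60, fall_speed=9.0, listener=(0.0, 0.0, 0.0)):
        """Advance the scene one frame and return the impacts to render as grains."""
        impacts = []
        for drop in self.drops:
            drop[1] -= fall_speed * dt
            if drop[1] <= 0.0:                       # drop hits the ground:
                impacts.append((drop[0], drop[2]))   # one audio-graphic impact
                drop[1] = self.rng.uniform(10, 20)   # recycle the drop
        # Keep only the impacts closest to the listener, up to the grain budget.
        lx, _, lz = listener
        impacts.sort(key=lambda p: (p[0] - lx) ** 2 + (p[1] - lz) ** 2)
        return impacts[: self.grain_budget]


if __name__ == "__main__":
    scene = RainTopophony()
    grains = scene.step()
    print(f"{len(grains)} grains triggered this frame (budget {scene.grain_budget})")
```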
Beyond their involvement in the design and development of the models underpinning the scientific research, the industrial partners will develop applications for interactive cartographic and position-based mapping navigation, interactive virtual universes such as urban models or video games, special effects for audiovisual productions, and multimedia environments for digital art and museum installations.
Project Details
Program
ANR / Cap Digital
Program Type
Contents and Interactions
Start Date
January 1, 2009
End Date
December 31, 2012
Status
In progress