Research Project: Compositional applications of auditory scene synthesis in concert spaces via virtual environments

Research Field: Music Perception and Cognition

  Research Project Information
Runtime: Jan 03, 2006 to Jul 29, 2009
Student staff: Joseph Malloch (McGill, Music Technology); Georgios Marentakis (McGill, Music Technology); Mark Marshall (McGill, Music Technology); Nils Peters (McGill, Music Technology)
Funding: Natural Sciences and Engineering Research Council / Canada Council for the Arts, New Media Initiative

Novel compositional and technological methods for exploiting the multidimensional nature of auditory space in music composition were developed. Central to the project was a 24-loudspeaker auditory virtual environment (AVE) based on virtual microphone control and designed for live electroacoustic concerts, capable of delivering the spatial aspects of compositions accurately over a large listening area. The project also investigated how musicians can best control the parameters of the AVE through their gestures, leaving them free to interpret the spatial aspects of the score. New auditory scene synthesis methods were developed that allow acoustic objects to be fused into, or segregated from, different auditory streams. At the end of the project, the newly developed methods were demonstrated in compositions for a small music ensemble.
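
Virtual microphone control renders a source over a loudspeaker array by treating each loudspeaker feed as the output of a virtual microphone placed in a simulated space: the gain follows the microphone's directivity pattern and its distance to the source, and the delay follows the propagation time. The Python sketch below illustrates that idea only in outline; the circular geometry, the cardioid pattern, and the function name virtual_mic_feeds are assumptions for demonstration, not the project's actual implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second

def virtual_mic_feeds(source_xy, mic_positions, mic_aims, pattern=0.5):
    """Gain and delay for each virtual microphone, given one virtual source.

    `pattern` blends omnidirectional (0.0) and figure-eight (1.0) responses;
    0.5 yields a cardioid. Gains fall off with 1/distance and delays follow
    the propagation time, so each microphone feed can drive one loudspeaker.
    Illustrative sketch only, not the project's implementation.
    """
    gains, delays = [], []
    for (mx, my), aim in zip(mic_positions, mic_aims):
        dx, dy = source_xy[0] - mx, source_xy[1] - my
        dist = max(np.hypot(dx, dy), 0.1)            # clamp to avoid blow-up
        incidence = np.arctan2(dy, dx) - aim         # angle off the mic axis
        directivity = (1.0 - pattern) + pattern * np.cos(incidence)
        gains.append(max(directivity, 0.0) / dist)   # distance attenuation
        delays.append(dist / SPEED_OF_SOUND)         # seconds
    return np.array(gains), np.array(delays)

# Example: 24 virtual cardioid microphones on a circle, each aimed at the
# centre and notionally feeding one of the 24 loudspeakers.
angles = np.linspace(0.0, 2.0 * np.pi, 24, endpoint=False)
mics = [(4.0 * np.cos(a), 4.0 * np.sin(a)) for a in angles]
aims = [a + np.pi for a in angles]
gains, delays = virtual_mic_feeds((1.5, -0.5), mics, aims)
```

In a setup along these lines, gestural or score-driven control would modulate the source position and pattern parameters over time, while the per-channel gains and delays determine how the source is distributed across the loudspeakers.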