An afternoon of seminars

With René Rusch, McGill University; Aaron Liu-Rosenbaum, Université Laval; and Ilja Frissen, McGill University.

  • 14:00-14:35 - René Rusch, McGill University: Figured-bass Patterns and Their Voice-Leading Tendencies in Bach's Four-Part Chorales

Abstract: The ELVIS (Electronic Locator of Vertical Successions) research team is currently studying contrapuntal patterns between voice pairs in a music corpus of over 5,000 pieces from 1300–1900, using the Python/music21-based software program VIS (Vertical Interval Successions), as a means to explore style change over time. Drawing from the work of ELVIS, my research examines figured-bass patterns and their voice-leading tendencies in a relatively modest corpus, J. S. Bach's chorales from the Albert Riemenschneider edition (1941). My presentation will provide an overview of the project in light of recent computational analyses of the Bach chorales (Quinn and White 2013; Quinn and Mavromatis 2011; Conklin 2001 and 2002) and will compare the results from these conference talks/publications to both the preliminary results achieved through VIS's Counterpoint Web Application and my analyses completed by hand. The presentation will also outline the project's current challenges and consider directions for further exploration.
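For readers unfamiliar with the tooling mentioned in the abstract, the brief sketch below illustrates the kind of vertical-interval extraction that music21 makes possible. It is a minimal illustration only, not the VIS software or the speaker's method; the choice of chorale (BWV 66.6) and the outer-voice chordal reduction are convenient assumptions for the example.

    # A minimal sketch in Python/music21 (not the ELVIS/VIS pipeline itself):
    # list the simple vertical intervals between the outer voices of one Bach
    # chorale from music21's built-in corpus. BWV 66.6 is an arbitrary example;
    # Riemenschneider numbering is not used here.
    from music21 import corpus, interval

    chorale = corpus.parse('bach/bwv66.6')
    reduction = chorale.chordify()  # collapse the four voices into a chordal reduction

    outer_intervals = []
    for ch in reduction.recurse().getElementsByClass('Chord'):
        pitches = sorted(ch.pitches)  # low to high
        outer_intervals.append(interval.Interval(pitches[0], pitches[-1]).simpleName)

    print(outer_intervals[:16])  # e.g. ['P8', 'M3', 'm6', ...]

A full figured-bass study would of course track successions between all voice pairs rather than a single outer-voice reduction, but the same corpus and interval machinery applies.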

Biography: René Rusch is assistant professor of music theory at the Schulich School of Music, McGill University. In addition to specializing in the music of Franz Schubert, Rusch’s research interests include 19th-century chromaticism, Schenkerian theory, and jazz theory. Her work has appeared in the Journal of Music Theory, Music Analysis, Music Theory Online, Journal of the Society for Musicology in Ireland, and Intersections. Rusch is currently serving on the SMT-Jazz Award Committee and on the editorial board of Intégral.


  • 14:40-15:15 - Aaron Liu-Rosenbaum, Université Laval: Merging art, science, and hearing loss in an interactive sound installation

Abstract: As an educator in the field of music production, I became concerned with the high volume levels at which my students habitually listened to their creations. I therefore sought a way to integrate this concern into my own research-creation work. The result was a pilot project called "filtres," developed with a team of students: a sonic portrait of Quebec City in the form of an interactive sound installation in which field recordings were filtered through physiologically accurate simulations of different forms of hearing loss. An integrated e-survey collected visitor feedback on the installation's efficacy as a means of raising public awareness of hearing loss in an artistic context. I will discuss the process of realizing the installation, explain some of the challenges we faced in this art-science cohabitation, and review possible ramifications of the visitor data we received.
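To give a rough sense of the signal-processing idea only (the installation used physiologically accurate hearing-loss simulations, which the sketch below does not attempt to reproduce), a crude high-frequency attenuation of a recording could be written as follows; the file name and the 2 kHz cutoff are hypothetical.

    # Illustrative sketch only: a 4th-order low-pass filter standing in for the
    # idea of simulating high-frequency hearing loss on a field recording.
    # The real installation used audiogram-based, physiologically accurate
    # simulations; "field_recording.wav" and the 2 kHz cutoff are hypothetical.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import butter, sosfilt

    rate, audio = wavfile.read('field_recording.wav')  # assumes 16-bit PCM input
    audio = audio.astype(np.float64)

    sos = butter(4, 2000.0, btype='low', fs=rate, output='sos')
    filtered = sosfilt(sos, audio, axis=0)  # filter along samples, each channel

    wavfile.write('field_recording_hl.wav', rate,
                  np.clip(filtered, -32768, 32767).astype(np.int16))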

Biography: A composer and music technologist, Aaron Liu-Rosenbaum is Director of the Certificate Program in Digital Audio at Université Laval, where he teaches courses in music technology, recording, digital audio production, and sound design. He received a BA in French Comparative Literature (1990, Columbia University), a BMus in Classical Composition (1994, New England Conservatory), an MA in Music Theory (1996, Columbia University), and a Ph.D. in Composition (2009, CUNY Graduate Center). In addition to playing rock guitar on the New York City circuit, he has studied composition with Robert Cogan, David del Tredici, and Tania León, and his music has been performed at venues in the United States and Paris. His research interests lie in the areas of technology and pedagogy, popular musicology, and the rapport between noise and culture. He is currently developing several interactive sound installations as research-creation projects.


  • 15:20-15:55 - Ilja Frissen, McGill University: The role of the haptic system in music perception

Abstract: Music perception is not a purely auditory event; the body comes into play at a very fundamental level as well. For musicians the body is the interface with their instrument, but it can also serve as a tool to support music production (e.g., tapping a foot along with the beat). For listeners the body more often than not becomes an expression of the realization of the music. In previous work we have shown that voluntary movements can affect how we hear things. The question I am interested in here is the extent to which the involvement of the body, that is, the haptic system, interacts with auditory perception in the context of music perception. In the first part of this talk I will highlight relevant previous work in order to set the stage for the second part, in which I will introduce the general goal for my work at CIRMMT.

Biography: Ilja Frissen studied cognitive psychology at Maastricht University (Netherlands) and obtained a PhD in experimental psychology from Tilburg University (Netherlands). He completed three postdocs. The first was at the Max Planck Institute for Biological Cybernetics in Tübingen (Germany), where he worked on the CyberWalk project within a European consortium that successfully developed an omnidirectional treadmill enabling the user to walk through large-scale virtual environments in a natural and unconstrained manner. His second postdoc was at the Multimodal Interaction Lab at McGill University; he was also a student member of CIRMMT. His third postdoc was at the Institut de Recherche en Communications et Cybernétique de Nantes (France), where he worked on human-machine interactions within the context of driving. Ilja’s research is mainly concerned with how the various senses interact with each other and how they interact with the motor system.