[CANCELLED] Christopher Dobrian: Expressive gesture: A technique for the use of gesture descriptors in algorithmic improvisation

This seminar is presented in collaboration with Research Axis 2 (Music information research).

CANCELLATION NOTICE: This event has been cancelled due to unforeseen circumstances.

Chris Dobrian seminar

Room E109 is located in the basement of the Strathcona Music Building behind Vinh's Café.

ABSTRACT

Music often conveys a sense of “gesture,” an evocation of motion and energy, which makes it dramatic, exciting, and expressive. One common challenge in the production of algorithmically generated computer music is the question of how to imbue the sound with the excitement and vitality of live performance. In the case of interactive computer music, one has the additional challenge of programming the computer to interpret the expressive qualities of music being performed in real time.

This lecture presents an approach to automatically analyzing and characterizing gesture in musical sound, as a way of improving a computer’s interaction with a human performer in a live improvisation. By describing music as patterns of changing parametric data, the computer can store and categorize descriptors of musical gestures. As an extension of that research, we can then consider how derivatives of that analysis data—the ways in which the data changes over time—characterize the gestural quality of a performance. In the algorithmic generation of music, control of those derivatives of change in musical parameters can improve the expressive potential of computerized improvisation.
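As a rough illustration of this idea (a minimal Python sketch, not Dobrian's actual system; the function name, parameters, and statistics chosen are hypothetical), one can summarize a sampled musical parameter stream by statistics of its first and second derivatives, treating the speed and acceleration of change as a simple gesture descriptor:

import numpy as np

def gesture_descriptor(values, rate_hz=100.0):
    """Summarize a parameter stream by how it changes over time.

    values  : 1-D array of a musical parameter (pitch, loudness, ...)
              sampled at rate_hz frames per second.
    returns : dict of simple statistics on the signal and its first and
              second derivatives (speed and acceleration of change).
    """
    v = np.asarray(values, dtype=float)
    dt = 1.0 / rate_hz
    d1 = np.gradient(v, dt)    # rate of change (e.g. semitones per second)
    d2 = np.gradient(d1, dt)   # acceleration of that change
    return {
        "range":      float(v.max() - v.min()),
        "mean_speed": float(np.mean(np.abs(d1))),
        "max_speed":  float(np.max(np.abs(d1))),
        "mean_accel": float(np.mean(np.abs(d2))),
    }

# Example: a rising pitch sweep that accelerates toward its peak.
t = np.linspace(0.0, 1.0, 101)
pitch = 60.0 + 12.0 * t**2     # MIDI note 60 rising one octave, speeding up
print(gesture_descriptor(pitch, rate_hz=100.0))

A descriptor vector of this kind could then be stored, categorized, or used to drive the derivatives of generated parameters, in the spirit of the approach the abstract describes.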

BIOGRAPHY

Christopher Dobrian is Professor of Music, with a joint appointment in the Department of Informatics, at the University of California, Irvine. He is a composer of both instrumental and electronic music, and teaches courses in composition, theory, and computer music. He is the co-founder and director of the doctoral program in Integrated Composition, Improvisation, and Technology (ICIT), and directs several music technology laboratories at UCI, including the Music Collaboration Laboratory (ColLab), the Realtime Experimental Audio Laboratory (REALab), and the Gassmann Electronic Music Studio. He conducts research on the development of artificially intelligent interactive computer systems for the cognition, composition, and improvisation of music, has published technical and theoretical articles on interactive computer music, and is the author of the original reference documentation and tutorials for the Max, MSP, and Jitter programming environments by Cycling '74. He holds a Ph.D. in Composition from the University of California, San Diego, where he studied composition with Joji Yuasa, Robert Erickson, Morton Feldman, and Bernard Rands, computer music with F. Richard Moore and George Lewis, and classical guitar with the Spanish masters Celin and Pepe Romero.

Dobrian has been an invited Fulbright specialist at the Korean National University of Arts, the University of Paris-Sorbonne, and McGill University in Montreal, and has been a guest professor at Yonsei University, National Taiwan Normal University, Paris 8 University, and the National University of Quilmes in Argentina. His computer music compositions include Microepiphanies: A Digital Opera, a completely computer-controlled performance; Invisible Walls for dancers, motion tracking system, and computer-controlled synthesizer; Distance Duo for two computer pianos in remote locations connected via Internet; Mannam for Korean flute (daegeum) and interactive computer system; JazzBot for piano and musical robots; Tautology for Two for trumpet, trombone, and computer, with the instrumentalists located in different cities; and Gestural for digital piano and interactive computer system responding to the musical gestures of an improviser.