An afternoon of seminars

With Rosemary Mountain, Concordia University; Luis Rodrigues, Concordia University; Ian Sinclair, MPB Technologies; Michael McGuffin, École de technologie supérieure (ÉTS); and Audrey Laplante, Université de Montréal

  • 14:00-14:30 - Rosemary Mountain, Concordia University (via Skype): Profile of an armchair researcher

This talk will provide a short overview of Rosemary Mountain's research interests (composition, music theory and analysis, teaching, perception and cognition, and multidisciplinary collaborations) and of two books in progress: Conversational Musicology and A Musician's Guide to Time. The rest of the talk will explain the basic motivations, structure, and planned objectives of her invention IMP-NESTAR: phase three of the Multimedia Thesaurus and Interactive Multimedia Playroom projects, now developing into a Network of Exploratory Spaces for Temporal Arts Research. IMP-NESTAR comprises various physical installations in which participants explore sets of sounds, videos, and still images by handling tangible objects, which they can then place in a 3-D space in a playful imitation of psychologists' multidimensional scaling experiments. The main objectives are to stimulate discourse about sound and multimedia (e.g., film, dance, performance art) and to focus attention on the diversity of terminology and perception, in order to help find common ground for discussion, whether among a group of collaborating artists working in different media or among musicologists working in different styles and genres. The project has potential links with CIRMMT research in music information classification and retrieval, music perception, and music analysis; current links have focused on the more pragmatic issue of developing an elegant solution that gives the various installations (currently in Montreal, Charlottetown, Aveiro, and Madrid) real-time connections via embedded sensors linking them through a virtual version.
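The multidimensional scaling (MDS) experiments referenced in the abstract place items in a low-dimensional space so that spatial distances mirror judged dissimilarities. A minimal sketch of classical MDS follows; the dissimilarity matrix among four hypothetical sounds is purely illustrative and not taken from the talk:

```python
import numpy as np

# Hypothetical pairwise dissimilarities among four sounds (0 = identical);
# the values are illustrative, chosen to be embeddable in 2-D.
s = 2 ** 0.5
D = np.array([
    [0.0, 1.0, 1.0, s],
    [1.0, 0.0, s, 1.0],
    [1.0, s, 0.0, 1.0],
    [s, 1.0, 1.0, 0.0],
])

def classical_mds(D, k=2):
    """Embed n items in k dimensions so distances approximate D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]      # keep the k largest
    scale = np.sqrt(np.maximum(vals[idx], 0.0))
    return vecs[:, idx] * scale           # n x k coordinates

coords = classical_mds(D)
```

Placing tangible objects in a 3-D space, as IMP-NESTAR participants do, amounts to producing such an embedding by hand rather than by eigendecomposition.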

 

  • 14:30-15:00 - Luis Rodrigues, Concordia University: Systems and control theory methods for articulatory-based networked systems

This talk will give an overview of my research on systems and control theory methods for articulatory-based networked systems, a theory that has been applied to articulatory-based models for speech synthesis and recognition. The talk will outline system identification methods, controllability and state-transfer energy, and multirate sensor network applications.
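The controllability and state-transfer-energy concepts mentioned above can be illustrated on a small linear system. A minimal sketch, assuming a hypothetical discrete-time double-integrator model (the matrices, horizon, and target state are illustrative, not from the talk):

```python
import numpy as np

# Hypothetical discrete-time system x[k+1] = A x[k] + B u[k]
# (a double integrator, chosen only for illustration).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])
n = A.shape[0]

# Controllability test: rank of [B, AB, ..., A^(n-1) B] must equal n.
ctrb = np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])
controllable = np.linalg.matrix_rank(ctrb) == n

# N-step controllability Gramian; the minimum input energy needed to
# drive the state from the origin to x_f in N steps is x_f^T W^{-1} x_f.
N = 5
W = sum(np.linalg.matrix_power(A, i) @ B @ B.T @ np.linalg.matrix_power(A, i).T
        for i in range(N))
x_f = np.array([1.0, 0.0])
energy = float(x_f @ np.linalg.solve(W, x_f))
```

The minimum-energy input sequence itself is u[i] = B^T (A^(N-1-i))^T W^{-1} x_f; applying it drives the state exactly to x_f with total input energy equal to the Gramian-based value above.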

 

  • 15:00-15:30 - Ian Sinclair, MPB Technologies: Freedom 6S Haptic Device as an Acoustic Interface

Haptic interface devices are becoming a vital component of virtual reality simulators in many fields.  Such devices permit force and torque feedback to a user, allowing the user to explore a virtual space while holding a stylus or thimble.  The stylus guides a virtual probe while offering the touch-sensation of being in the virtual space.  

Surgical simulation has been the predominant application for such devices, with its focus on the correct "feel" of the surgical instruments.  An emerging application is the simulation of virtual musical instruments.  Virtual reality simulations can have both visual and acoustic elements that allow novel artistic expressions.  As a high fidelity desktop unit, the Freedom 6S has been used in a number of experimental acoustic setups.

We present the development of the Freedom 6S from its prototype in a McGill lab to a commercial unit. We discuss the engineering trade-offs: the device must be stiff yet back-drivable, and precise yet forceful enough to give a convincing simulation of a probe moving on a solid surface. A basic design choice was to offer six degrees of freedom, permitting free motion of the user's hand while holding the stylus. Example applications in music technology are shown.

Co-author: Steve Sinclair, Université Pierre et Marie Curie, Paris, France

 

  • 15:30-15:45 - BREAK

 

  • 15:45-16:15 - Michael McGuffin, École de technologie supérieure (ÉTS): Of Digits and Eyes: Interaction Techniques and Visualizations for Music

Research into human-computer interaction (HCI) has resulted in many ways to leverage the expressiveness of human fingers for input, and the capabilities of the human visual system to process output.  This talk first presents examples of recent research in HCI and information visualization by my group at ÉTS, including a novel multitouch radial menu technique.  Next, I will show a zoomable user interface developed by François Cabrol at ÉTS to help compose musical passages, which has been accepted for publication at CMMR 2013.  Finally, I will argue that most recent visualizations of music have only scratched the surface of what should be possible, and challenge you to think of new ways to "see" music.
 
 
  • 16:15-16:45 - Audrey Laplante, Université de Montréal: On music information seeking behaviour

Information seeking behaviour is an important research area in information science that nevertheless struggles to attract attention from researchers in other fields. However, studies of how people interact with information (e.g., how their information needs emerge, how they acquire information, how they use and process it) are relevant to many other domains, including information system design. The results of such studies can help developers better understand users' needs and thus guide system development. In this talk, I will provide an overview of my research on music information-seeking behaviour. More specifically, I will present the results of a qualitative study of how music information is shared within adolescents' social networks and of the role social network sites play in that process.

 

  • 16:45-17:15 - Discussion