MoCap Data Exchange Workshop

The MoCap Data Exchange and the Establishment of a preliminary Database of Music Performances (MoCap and Video) workshop is the second workshop organized within CIRMMT research axis 2 (Musical Gestures, Devices and Motion Capture). It will take place during the Society for Music Perception and Cognition 2007 conference in Montreal.

To register, send an email to mocap_workshop at cirmmt.mcgill.ca


For additional information, please contact:

Isabelle Cossette (isabelle.cossette1 at mcgill.ca)
Marcelo Wanderley (marcelo.wanderley at mcgill.ca)

Maximum 30 participants; registrations will be accepted on a first-come, first-served basis.

Workshop Description

Motion Capture (MoCap) systems have traditionally been used in motor control research (e.g. gait and rehabilitation) and in computer animation (e.g. video games and movies). Thanks to substantial funding from various agencies and to falling technology costs, motion capture systems are becoming increasingly available in music performance research laboratories.

Considering McGill University alone, around a dozen research laboratories use motion capture systems (at least three in a musical context). Ideally, data acquired with one system should be easily readable by another manufacturer's software (or by dedicated Matlab routines). Although file formats such as c3d theoretically allow for such exchange, in practice it usually fails because of differences in software (and also hardware) implementations. Systems often cannot open ancillary data (analog channels, video data), or actually lose information on marker labels and/or body models.
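One practical way to see where an exchange breaks down is to inspect a file at the byte level before trying to load it. The sketch below, in Python, decodes the fixed 512-byte header block of an Intel-byte-order c3d file following the field offsets given in the public C3D specification; note that marker labels, analog channel descriptions, and body-model information live in the separate parameter section, which is precisely where implementations tend to diverge.

```python
import struct

def read_c3d_header(raw: bytes) -> dict:
    """Decode the fixed 512-byte C3D header block (Intel/little-endian files).

    Field offsets follow the public C3D specification (c3d.org); only the
    fields useful for a quick compatibility check are decoded here.
    """
    if len(raw) < 512:
        raise ValueError("C3D header block is 512 bytes")
    param_block, key = raw[0], raw[1]
    if key != 0x50:  # the value 80 in byte 2 identifies a C3D file
        raise ValueError("not a C3D file (missing 0x50 key byte)")
    # Words 2-5: point count, analog measurements per frame, frame range.
    n_points, analog_per_frame, first_frame, last_frame = \
        struct.unpack_from("<4H", raw, 2)
    scale, = struct.unpack_from("<f", raw, 12)            # words 7-8
    data_start, analog_samples = struct.unpack_from("<2H", raw, 16)  # words 9-10
    frame_rate, = struct.unpack_from("<f", raw, 20)       # words 11-12
    return {
        "parameter_block": param_block,
        "points": n_points,
        "analog_per_frame": analog_per_frame,
        "first_frame": first_frame,
        "last_frame": last_frame,
        "float_data": scale < 0,  # negative scale => floating-point samples
        "data_start_block": data_start,
        "analog_samples_per_frame": analog_samples,
        "frame_rate": frame_rate,
    }
```

Comparing these header fields across two exports of the same take (for example, point counts or analog sample rates that disagree) gives a quick, software-independent hint of what a receiving system is likely to drop.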

The consequence is that researchers typically capture and analyze their own data with their own systems, so comparisons between studies and the reproduction of results cannot easily be done. Although every scientific experiment is unique, exchanging data among laboratories could at the very least provide a benchmark for analysis tools; at best, it would allow the comparison of similar experiments and the development of research collaborations across laboratories.

For this workshop, we will ask potential participants to exchange sets of music performance data obtained with different systems before the event. We will verify which exchanges are directly feasible (if any) and identify the bottlenecks in those that are not. The systems we currently use for measurements of full-body, hand, and chest-wall displacement include: Vicon System 460, Vicon MX, BTS Smart, NDI Optotrak, NDI Certus, Phoenix VisualEyez (and potentially PhaseSpace).

A second goal – given that data from various researchers will be available at the workshop – is to discuss what a *basic* motion capture methodology useful to many researchers (unfortunately, not to all) would look like. This includes the number and placement of markers (e.g. using Vicon's Plug-in Gait, so that the body's centre of mass can be calculated directly from the markers without the need for force plates), the placement and requirements of video camera(s) (e.g. a background shot without the performer, for background subtraction during analysis), requirements on the performer's clothing (e.g. lycra suits to avoid marker movement), etc.
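To illustrate the marker-based centre-of-mass idea mentioned above: once markers define per-segment centres of mass, the whole-body centre of mass is simply their mass-weighted average. The sketch below assumes this weighted-average formulation; the segment names and mass fractions in the example are illustrative placeholders, not the actual anthropometric tables used by Plug-in Gait or similar models.

```python
def body_com(segment_coms, mass_fractions):
    """Whole-body centre of mass as a mass-weighted average of segment COMs.

    segment_coms:   dict mapping segment name -> (x, y, z) position
    mass_fractions: dict mapping segment name -> fraction of total body mass
                    (placeholder values here; real models take these from
                    published anthropometric tables)
    """
    total = sum(mass_fractions[s] for s in segment_coms)
    return tuple(
        sum(mass_fractions[s] * segment_coms[s][i] for s in segment_coms) / total
        for i in range(3)
    )

# Illustrative two-segment example (mass fractions are placeholders):
coms = {"trunk": (0.0, 0.0, 1.0), "legs": (0.0, 0.0, 0.5)}
fractions = {"trunk": 0.5, "legs": 0.5}
print(body_com(coms, fractions))  # -> (0.0, 0.0, 0.75)
```

The appeal of this approach in the workshop context is that it needs only marker trajectories, so the same computation can be rerun on data exported from any of the systems listed above.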

Though this workshop may not immediately produce scientific data, it is a first step toward multidisciplinary, multi-level collaborations on body measurement between the various laboratories working with MoCap, which should in turn increase research productivity.

This is the second workshop organized within CIRMMT research axis "Musical Gestures, Devices and Motion Capture." It follows the Workshop on Motion Capture for Music Performance, held at McGill University on October 30-31, 2006.

For more information on the SMPC conference: http://alcor.concordia.ca/~smpc2007/