
Workshop on SIMSSA XII


This workshop is co-organized by CIRMMT Research Axis 2 (Music information research) and SIMSSA. This workshop is free and open to all. Registration is required.

What?
  • Research Workshop
When? 07/08/2017
from 09:00 to 14:30
Where? A832, Elizabeth Wirth Music Building, 527 Sherbrooke St. West

Information on the SIMSSA project can be found here: https://simssa.ca

Registration

Space is limited; register now to ensure your seat!

To register for this workshop: Registration - Workshop on SIMSSA XII

Description

The workshop features presentations on recent work in the SIMSSA Project, including Pixel.js, the Interactive Classifier, Neon.js, and our latest optical music recognition (OMR) developments. Craig Sapp is our featured keynote speaker.

Keynote Bio

Craig is a researcher at the Packard Humanities Institute/Center for Computer Assisted Research in the Humanities, as well as an adjunct professor at Stanford University. He holds a Ph.D. in computer-based music and acoustics from Stanford, a Master's degree in composition and piano performance, and an undergraduate degree in music and physics from the University of Virginia. He has taught at the Peabody Conservatory in Baltimore, Maryland, and worked on computational performance analysis of Chopin's mazurkas with Nicholas Cook at Royal Holloway, University of London. He is currently working on a digital edition of Chopin's early prints at the Fryderyk Chopin Institute in Warsaw, Poland. He is also technical director of the Josquin Research Project (http://josquin.stanford.edu) and the Tasso in Music Project (http://www.tassomusic.org).

Abstract

Verovio Humdrum Viewer (VHV, http://verovio.humdrum.org) is a web interface that integrates verovio (http://www.verovio.org) and humlib (http://humlib.humdrum.org) into an online music notation editor and music-analysis interface. Music can currently be imported into VHV as Humdrum or MusicXML data, with MEI data import planned in the near future. The VHV interface allows both textual and graphical editing of the music, as well as a system for applying analysis filters to the digital scores. VHV currently displays music for the Tasso in Music Project (http://www.tassomusic.org), powers several online analysis tools in the Josquin Research Project (http://josquin.stanford.edu), and is being used for a planned critical edition of early Chopin music prints at the Fryderyk Chopin Institute in Warsaw, Poland. Recent work on VHV this summer has focused on integrating syntax highlighting and real-time data validation, covering both the basic data structure and the rhythmic content. These features allow faster editing and proofreading of the music.
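To illustrate the kind of data VHV edits (this fragment is not from the talk, just a minimal sketch of standard Humdrum **kern syntax): a single spine opens with an exclusive interpretation, carries clef and meter interpretations, and holds duration-plus-pitch tokens between barlines.

```
**kern      data type: Humdrum kern notation
*clefG2     treble clef
*M4/4       4/4 meter
=1          barline, measure 1
4c          quarter note, C
4d
4e
4f
=2
1g          whole note, G
==          final barline
*-          spine terminator
```

VHV's rhythmic validation can flag, for example, a measure whose note durations do not sum to the prevailing meter, while its syntax highlighting visually separates interpretation lines (beginning with *) from barlines and data tokens.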

Slides are available here: http://bit.ly/simssa-xii-vhv

Guests

  • Claire Arthur, McGill University
  • Alex Daigle, McGill University
  • Ichiro Fujinaga, CIRMMT, McGill University
  • David Garfinkle, McGill University
  • Andrew Hankinson, Bodleian Libraries, University of Oxford
  • Yaolong Ju, CIRMMT, McGill University
  • Cory McKay, CIRMMT, Marianopolis College
  • Zoé McLennan, McGill University
  • Alex Morgan, Université libre de Bruxelles
  • Sacha Perry-Fagant, McGill University
  • Zeyad Saleh, McGill University
  • Craig Sapp, CCARH, Stanford University
  • Martha Thomae, CIRMMT, McGill University
  • Andrew Tran, McGill University
  • Gabriel Vigliensoni, CIRMMT, McGill University
  • Jorge Calvo Zaragoza, McGill University
  • Ké Zhang, McGill University

Schedule

9:00-10:30

  • 9:00-9:20, Ichiro Fujinaga: Introduction
  • 9:20-9:50, Craig Sapp, Keynote Speaker: Verovio Humdrum Viewer: online music notation rendering and analysis
  • 9:50-10:10, Andrew Hankinson: Building the new DIAMM: Linking and sharing data for medieval musicology
  • 10:10-10:30, Alex Morgan: Cross-Platform Analysis: Combining Music-Analysis Programs

10:30-10:50 Coffee Break

10:50-12:10

  • 10:50-11:02, Gabriel Vigliensoni: Infrastructure for Human-Aided Optical Music Recognition
  • 11:02-11:22, Jorge Calvo Zaragoza: Pixelwise Classification for Music Document Analysis
  • 11:22-11:34, Ké Zhang & Zeyad Saleh: Pixel.js: Web-based Pixel Classification Correction Platform for Ground Truth Creation
  • 11:34-11:46, Sacha Perry-Fagant & Alex Daigle: Interactive Classification of Connected Components in Web Applications Using the Gamera Framework
  • 11:46-11:58, Zoé McLennan & Andrew Tran: Neon.js v1.0 - Neume Editor ONline
  • 11:58-12:10, David Garfinkle: PatternFinder: Content-Based Music Retrieval with music21

12:10-13:10 Lunch

13:10-14:30

  • 13:10-13:22, Yaolong Ju: Non-chord tone identification
  • 13:22-13:42, Claire Arthur: Renaissance Counterpoint in Theory and Practice: A Case Study
  • 13:42-14:02, Cory McKay: Using Statistical Feature Extraction and Machine Learning in Musicological Research
  • 14:02-14:14, Martha Thomae: Automatic Scoring-up Tool for Mensural Music

Round Table & Final Discussion
