Stephen Brewster, University of Glasgow, UK: "Using musical structures to create multimodal interfaces"

ABSTRACT:
In this lecture I show how we use simple musical structures to design auditory and tactile feedback, and how these can be used to create new multimodal user interfaces that improve the usability of future mobile devices. Mobile phones are ubiquitous but they have limitations: their small screens and keyboards make them difficult to use. Screens, for example, cannot easily be made bigger, as the phone has to be small enough to fit into a pocket or bag, so other modalities are needed to increase the display space and allow ‘eyes-free’ use.

At Glasgow we have worked on Earcons, or structured non-speech sounds, for many years. These are designed using basic concepts from music (manipulations of timbre, rhythm, pitch, tempo and 3D location) and can be used to create sounds that represent objects, actions or hierarchical structures in user interfaces. We have extended the work on Earcons to create Tactons, or tactile icons. We based the design of Tactons on structures similar to those of Earcons (manipulating rhythm, waveform and body location). These can again be used in user interfaces, and I will show how feedback learned in one modality can be transferred to another.
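
To make this parameter space concrete, the short Python sketch below (purely illustrative, with hypothetical names; it is not code from the lecture) shows one way the structural parameters listed above could describe an Earcon and a Tacton, and how related messages can share some parameters while varying others.

    # Illustrative sketch only: hypothetical dataclasses for the musical and
    # tactile parameters mentioned above; not the speaker's implementation.
    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class Earcon:
        timbre: str                            # e.g. "piano", "organ"
        rhythm: Tuple[float, ...]              # note durations in beats
        pitch: float                           # base pitch in Hz
        tempo: float                           # beats per minute
        location: Tuple[float, float, float]   # 3D position around the listener

    @dataclass
    class Tacton:
        rhythm: Tuple[float, ...]              # pulse durations in seconds
        waveform: str                          # e.g. "sine", "square"
        body_location: str                     # e.g. "wrist", "waist"

    # Related messages can share a rhythm but differ in timbre and location,
    # so they sound like members of the same family.
    new_email = Earcon("piano", (0.5, 0.5, 1.0), 440.0, 120.0, (1.0, 0.0, 0.0))
    new_text  = Earcon("organ", (0.5, 0.5, 1.0), 440.0, 120.0, (-1.0, 0.0, 0.0))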

I will show how we have used Earcons to represent hierarchical menus in phones, and 3D sound for ‘audio windows’ and other audio ‘widgets’. We have also used Tactons to represent alerts and alarms, to convey feedback about buttons on touchscreens (where there are no physical buttons), and as non-visual feedback to accompany gestures.
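
As a rough illustration of the hierarchical-menu idea (again a hypothetical sketch, not the actual system described in the lecture), each level of the menu hierarchy could change one musical parameter, so that items under the same branch share audible features:

    # Illustrative sketch only: deriving an Earcon-like description for a node
    # in a hierarchical menu, where each level varies a different parameter.
    TIMBRES = ["piano", "organ", "violin", "flute"]
    RHYTHMS = [(1.0,), (0.5, 0.5), (0.25, 0.25, 0.5), (0.5, 0.25, 0.25)]
    PITCHES = [262.0, 330.0, 392.0, 523.0]   # C4, E4, G4, C5 in Hz

    def menu_earcon(path):
        """path is a list of child indices from the menu root, e.g. [2, 0, 1]."""
        timbre = TIMBRES[path[0] % len(TIMBRES)] if len(path) > 0 else "piano"
        rhythm = RHYTHMS[path[1] % len(RHYTHMS)] if len(path) > 1 else (1.0,)
        pitch  = PITCHES[path[2] % len(PITCHES)] if len(path) > 2 else 262.0
        return {"timbre": timbre, "rhythm": rhythm, "pitch": pitch}

    # An item three levels deep shares its timbre with every sibling of its
    # top-level branch and its rhythm with everything under its second level.
    print(menu_earcon([2, 0, 1]))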

ABOUT STEPHEN BREWSTER:

Stephen Brewster has been a professor of human-computer interaction in the Department of Computing Science at the University of Glasgow since 2001. He is currently an EPSRC Advanced Research Fellow and studies the use of multimodal interaction for a range of different applications, focusing on novel interfaces for mobile devices.

Brewster's research focuses on multimodal human-computer interaction: using multiple sensory modalities (particularly hearing, touch and smell) to create richer interactions between humans and computers. His work has a strong experimental focus, applying perceptual research to practical situations. He has shown that novel uses of multimodality can significantly improve usability in a wide range of situations: for mobile users, visually impaired people, older users, and in medical applications.

More information is available at www.dcs.gla.ac.uk/~stephen