Who/what gets to steer our attention in an age of ubiquitous hearables?
A 1935 poster made by the Anti-Noise League, as featured in the Noise Abatement Exhibition at the Science Museum. Original image available at: https://journal.sciencemuseum.ac.uk/wp-content/uploads/2020/05/figure-1-1-1536x1028.jpg

Building on Dr. Gershon Dublon's Distinguished Lecture on December 11, this workshop, organized by CIRMMT RA1, aims to bring together researchers in audio, mixed reality and HCI.

Gershon Dublon: a retake on a 1935 poster made by the Anti-Noise League, as featured in the Noise Abatement Exhibition at the Science Museum. See https://journal.sciencemuseum.ac.uk/article/sound-technology-and-the-museum/.

Abstract

Over the last year or so, millions of people have started moving through the world wearing a new class of hearables. They look similar to previous generations, but their internals combine head-tracked, personalized, and environmentally adapted binaural content, microphone array beamforming, context-aware adaptive hear-through or noise cancellation, a voice assistant, and access to user location, biometrics, and aspects of pose. Users can import measured audiograms and apply EQ to their microphone world feed.
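
To ground that last capability, here is a minimal sketch, assuming a half-gain fitting rule and a block-based FFT-domain EQ, of how an imported audiogram could drive per-band gains on a hear-through microphone feed. The audiogram values, band layout, and processing choices are illustrative assumptions, not any vendor's actual pipeline.

```python
import numpy as np

# Hypothetical measured audiogram: hearing thresholds (dB HL) at standard
# audiometric frequencies. The values are illustrative, not real data.
audiogram = {250: 10, 500: 15, 1000: 20, 2000: 35, 4000: 50, 8000: 55}

def half_gain_rule(thresholds_db):
    """Map hearing thresholds to per-band gains using the classic half-gain
    fitting rule (insertion gain of roughly half the threshold elevation)."""
    return {f: 0.5 * hl for f, hl in thresholds_db.items()}

def eq_world_feed(mic_block, fs, band_gains_db):
    """Apply per-band gains to one block of the microphone 'world feed'
    using a simple FFT-domain EQ (block-based sketch, no overlap-add)."""
    spectrum = np.fft.rfft(mic_block)
    freqs = np.fft.rfftfreq(len(mic_block), d=1.0 / fs)
    bands = sorted(band_gains_db)
    gains = [band_gains_db[f] for f in bands]
    gain_db = np.interp(freqs, bands, gains)   # interpolate gains in dB across the spectrum
    spectrum *= 10.0 ** (gain_db / 20.0)       # convert to linear and apply
    return np.fft.irfft(spectrum, n=len(mic_block))

fs = 48_000
mic_block = np.random.randn(2048)              # stand-in for one captured hear-through block
out = eq_world_feed(mic_block, fs, half_gain_rule(audiogram))
```

A production hear-through path would of course use overlap-add or time-domain filters and loudness safeguards; the point here is only the audiogram-to-gain mapping.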

While the major challenges are by no means solved, this workshop roundtable takes as its point of departure that all of the above has quite suddenly become ubiquitous, albeit under industry control. It’s a good time to speculate about what this means for related research fields, from audiology to augmented reality: where is this going, where can it be positively leveraged, and how should it be steered? Following a series of short presentations from participants working in related areas, Dr. Dublon will lead off a roundtable discussion by stepping through sensing opportunities, with a focus on attentional decoding. The planned discussion topics centre on the role of sensor-driven automation in mixing and adaptive processing, drawing on background literature on detecting auditory attention from physiological signals such as EEG, and on opportunities to shape our soundscapes in ways that filter out undesirable distractions while preserving the sounds to which we wish to attend.
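
To make the attentional-decoding thread concrete, the sketch below illustrates envelope-based auditory attention decoding in the spirit of the EEG literature cited under "Examples and bibliography": a ridge-regularized linear decoder reconstructs the attended speech envelope from (synthetic, stand-in) EEG, and the competing stream whose envelope correlates best with the reconstruction is taken as the attended one. All data, lag windows, and regularization values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur, n_ch, n_lags = 64, 60, 16, 16   # 64 Hz EEG, 60 s, 16 channels, 250 ms of lags
n = fs * dur

def smooth_noise(n, k=16):
    """Crude stand-in for a speech envelope: smoothed positive noise."""
    x = np.abs(rng.standard_normal(n + k))
    return np.convolve(x, np.ones(k) / k, mode="valid")[:n]

# Two competing speech envelopes; the synthetic EEG tracks stream A (strongly, for the demo).
env_a = smooth_noise(n); env_a -= env_a.mean()
env_b = smooth_noise(n); env_b -= env_b.mean()
eeg = env_a[:, None] + rng.standard_normal((n, n_ch))

def lagged(x, n_lags):
    """Stack time-lagged copies of every channel to form a simple decoding design matrix."""
    return np.concatenate([np.roll(x, lag, axis=0) for lag in range(n_lags)], axis=1)

X = lagged(eeg, n_lags)
half = n // 2

# Train a ridge-regularized linear decoder on the first half (attended stream = A).
lam = 1e2
w = np.linalg.solve(X[:half].T @ X[:half] + lam * np.eye(X.shape[1]),
                    X[:half].T @ env_a[:half])

# Test: reconstruct the envelope on the held-out half and pick the better-correlated stream.
recon = X[half:] @ w
corr_a = np.corrcoef(recon, env_a[half:])[0, 1]
corr_b = np.corrcoef(recon, env_b[half:])[0, 1]
print(f"decoded attention: {'A' if corr_a > corr_b else 'B'}  (r_A={corr_a:.2f}, r_B={corr_b:.2f})")
```

Real studies use recorded EEG, proper preprocessing, and cross-validated decoders, but the same correlate-and-compare logic is what would let a hearable steer its mix toward the attended source.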

Registration

We're calling on all interested members to sign up as participants and/or presenters.

Please provide more information about your proposed presentation in the form available here.

Examples and bibliography

D. J. Strauss, F. I. Corona-Strauss, A. Schroeer, P. Flotho, R. Hannemann, and S. A. Hackley, “Vestigial auriculomotor activity indicates the direction of auditory attention in humans,” eLife, vol. 9, p. e54536, Jul. 2020, doi: 10.7554/eLife.54536.

I. V. Stuldreher, N. Thammasan, J. B. F. Van Erp, and A.-M. Brouwer, “Physiological synchrony in EEG, electrodermal activity and heart rate reflects shared selective auditory attention,” J. Neural Eng., vol. 17, no. 4, p. 046028, Aug. 2020, doi: 10.1088/1741-2552/aba87d.

X. Fan, D. Pearl, R. Howard, L. Shangguan, and T. Thormundsson, “APG: Audioplethysmography for Cardiac Monitoring in Hearables,” in Proceedings of the 29th Annual International Conference on Mobile Computing and Networking, Madrid, Spain: ACM, Oct. 2023, pp. 1–15. doi: 10.1145/3570361.3613281.

B. Veluri, M. Itani, J. Chan, T. Yoshioka, and S. Gollakota, “Semantic Hearing: Programming Acoustic Scenes with Binaural Hearables,” in Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, San Francisco, CA, USA: ACM, Oct. 2023, pp. 1–15. doi: 10.1145/3586183.3606779.

Biography

Gershon Dublon (they/them) is an interaction researcher, electrical engineer, and artist working with sensing and mixed reality to empower human perception. Currently, Dublon is a Senior Researcher in the Advanced Technologies team at Sonos, focusing on applications of multimodal sensing to next-gen audio UX.

Dublon's doctoral work proposed systems and methods to comprehend massive, longitudinal sensor data and AI systems in the service of a sensory connection to self and environment. Dublon has published articles in the journal Presence, Scientific American, Human Factors in Computing Systems (ACM CHI), IEEE Sensors, New Interfaces for Musical Expression (NIME), Body Sensor Networks (BSN), International Conference on Machine Learning (ICML), and others, and recently contributed a chapter to the MIT Press book Swamps and the New Imagination. Dublon’s projects and studio productions have been exhibited in venues and festivals including Boston’s Museum of Fine Arts, Mexico’s National Center for the Arts, Ars Electronica, and the Sundance Film Festival, and covered by the New York Times, Associated Press, BBC, NHK, and others.

In 2018, with artist Xin Liu, Dublon co-founded slow immediate, a creative engineering studio incubated by The New Museum’s NEW INC program and ONX Studio. As the firm's applied researcher, Dublon designed electronic controls for a microgravity robotic system that was launched into space, scent micro-delivery systems, experimental audio-haptic immersive experiences, and more. Dublon is also a board member of Living Observatory, a Boston-based non-profit organization focused on the future of wetland restoration. Dublon received an SM and PhD from the MIT Media Lab, where their research in the Responsive Environments Group was supervised by Prof. Joe Paradiso, and a BSEE from Yale.