The AudioLab once again had a strong presence at the October AES convention in New York. Tomasz Rudzki and Ben Tsui presented work from a number of AudioLab members on SALTE (Spatial Audio Listening Test Environment), a new framework for conducting listening tests in virtual reality. Ben and Tomasz had two technical papers at the convention and also presented a workshop on the framework:
- Building Listening Tests in VR
- SALTE Pt. 1: A Virtual Reality Tool for Streamlined and Standardized Spatial Audio Listening Tests
- SALTE Pt. 2: On the Design of the SALTE Audio Rendering Engine for Spatial Audio Listening Tests in VR
Gavin Kearney also chaired an open panel discussion of the Abbey Road Spatial Audio Forum on the future of music production in virtual and augmented reality. The panel comprised Mirek Stiles (Abbey Road), Stephen Barton (EA Games/Afterlight), Etienne Corteel (L-Acoustics), Oliver Kadel (1.618 Digital) and Hyunkook Lee (University of Huddersfield). The panel discussed existing workflows and tools for immersive music production, emerging trends and challenges, and opportunities for the future.
Gavin was also a member of another panel, on ‘Reproduction and Evaluation of Spatial Audio through Loudspeakers’. The panel was chaired by Patrick Flanagan (THX), and joining Gavin were Marcos Simon (AudioScenic), Nils Peters (Qualcomm) and Juan Simon Calle Benitez (THX). The panel discussed the challenges of reproducing spatial audio in different loudspeaker scenarios and how best to evaluate spatial audio quality over loudspeaker systems.