AI acoustic response extrapolation

AI approach to acoustic response extrapolation for Bonza Music immersive rendering

Project Profile

Project Lead

Dr Michael McLoughlin (Academic Lead)
Fiona Ryder (Industry Lead)

Associated People

University of York
Dr Karolina Prawda
Professor Gavin Kearney
Professor Helena Daffern

Bonza Music Ltd
Dr Alex Carot
Ben Heritage

Research Theme

AI and Creative Industries

Project Funder

Innovate UK

Project Partners

Bonza Music Ltd

Project Description

Immersion in virtual and augmented reality solutions relies on plausible spatial audio. However, performing acoustic measurements often requires many individual measurements of source-receiver pairs with specialist spatial microphones, making the procedure time-consuming and expensive. In this project, we explored a method that uses machine learning and signal processing techniques to generate a set of spatially extrapolated ambisonic measurements from a single mono Room Impulse Response (RIR). We evaluated the effectiveness of our method in producing perceptually plausible extrapolated RIRs using a 3-Alternative Forced Choice (3AFC) listening test.
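In a 3AFC listening test, a listener picks one of three stimuli per trial, so chance performance is 1/3, and discrimination is typically assessed against the binomial distribution. As a minimal illustrative sketch (the function name and trial counts are ours, not from the project), the following computes the one-sided binomial p-value for a 3AFC result and the number of correct responses needed to reject chance-level guessing:

```python
from math import comb

def afc3_p_value(correct: int, trials: int, chance: float = 1 / 3) -> float:
    """One-sided binomial p-value: probability of observing at least
    `correct` successes in `trials` if the listener is guessing
    (success rate 1/3 in a 3AFC paradigm)."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(correct, trials + 1)
    )

# Example: with 20 trials, how many correct responses are needed
# before performance differs significantly from guessing (p < 0.05)?
threshold = next(k for k in range(21) if afc3_p_value(k, 20) < 0.05)
print(threshold)  # 11 correct out of 20 (p ≈ 0.038)
```

A non-significant result here is the desired outcome for a plausibility test: it indicates listeners could not reliably distinguish the extrapolated RIRs from measured ones.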