Developing a Database of Immersive Audiovisual Materials

Project Description

It is a widely recognised problem amongst audio engineers that there is limited compatibility between the three key spatial audio formats: channel-based, scene-based and object-based. There have been several attempts to introduce formats and rendering methods that allow implementation of, and conversion between, these formats; however, a single, packaged solution has yet to emerge.

The Alliance for Open Media’s (AOMedia) new Immersive Audio Model and Formats (IAMF) container specification aims to do just this under a royalty-free licence. This research project has supported that work on the rendering side and through the generation of materials to test and showcase the format. Using a novel combination of virtual production and immersive audio technologies, three audiovisual scenes were created, each for a different context.

This project investigated a variety of spatial audio workflows and renderers, from Ambisonics to Dolby Atmos. While some of these workflows and rendering methods are industry standard, each has its own limitations. The research has also produced a living database of current spatial audio workflows, documenting those limitations.
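To illustrate the scene-based approach mentioned above, the sketch below encodes a mono sample into first-order Ambisonics. This is a generic textbook example, not part of IAMF or any specific renderer used in the project; the function name and the choice of ACN channel ordering with SN3D normalisation are assumptions for illustration.

```python
import math

def encode_foa(sample, azimuth_deg, elevation_deg):
    """Encode a mono sample to first-order Ambisonics.

    Illustrative sketch only: assumes ACN channel ordering and SN3D
    normalisation (the AmbiX convention). Angles follow the usual
    convention of azimuth anticlockwise from front, elevation upward.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample                                 # ACN 0: omnidirectional
    y = sample * math.sin(az) * math.cos(el)   # ACN 1: left-right
    z = sample * math.sin(el)                  # ACN 2: up-down
    x = sample * math.cos(az) * math.cos(el)   # ACN 3: front-back
    return [w, y, z, x]

# A source directly in front contributes only to W and X:
print(encode_foa(1.0, 0.0, 0.0))  # [1.0, 0.0, 0.0, 1.0]
```

Because the scene is stored as these spherical-harmonic signals rather than speaker feeds or discrete objects, it can later be decoded to any loudspeaker layout, which is one reason conversion between the three format families is non-trivial.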

Some useful links on IAMF can be found below:

The database of materials produced can be found on Zenodo.

More information on this project can be found on our Spatial Audio Library.