PhD 3: Real-time Hybrid Acoustic Simulation for Metaverse Experiences

Industry Partner: Sony Interactive Entertainment

PhD Supervisor: Damian Murphy; second supervisor tbc.

Description:

Engaging metaverse experiences require significant realism in the acoustics of the virtual environments. Whether creating a plausible virtual reality world or matching real-world acoustics in an augmented reality scene, realistic acoustic rendering is critical to immersion and sense of presence. Modern game engines are starting to provide solutions for geometric acoustic modelling using approximation methods such as image source and ray tracing. Wave-based acoustic modelling methods are more physically accurate but carry a significant computational overhead, making them difficult to apply to real-time interactive audio. This PhD project looks at developing appropriate strategies for combining these techniques in real time, as well as optimisations that can be achieved by exploiting the perceptual limitations of human spatial hearing.
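By way of illustration only (this sketch is not part of the project brief), the code below shows the core idea behind the image-source method mentioned above: mirroring a sound source across a planar wall gives a virtual "image" source, and the distance from that image to the listener determines the arrival delay and spreading loss of a first-order reflection. All geometry, values, and function names here are hypothetical, chosen only to make the idea concrete.

// Minimal first-order image-source sketch (hypothetical values and names).
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static double length(Vec3 a) { return std::sqrt(dot(a, a)); }

// Mirror a source across an infinite plane given by a point and a unit normal:
// image = source - 2 * ((source - point) . normal) * normal
static Vec3 imageSource(Vec3 source, Vec3 planePoint, Vec3 planeNormal) {
    double d = dot(sub(source, planePoint), planeNormal);
    return {source.x - 2.0 * d * planeNormal.x,
            source.y - 2.0 * d * planeNormal.y,
            source.z - 2.0 * d * planeNormal.z};
}

int main() {
    const double speedOfSound = 343.0;   // m/s in air at room temperature
    Vec3 source    {2.0, 1.5, 1.0};
    Vec3 listener  {5.0, 1.5, 1.5};
    Vec3 wallPoint {0.0, 0.0, 0.0};      // a wall lying in the y = 0 plane
    Vec3 wallNormal{0.0, 1.0, 0.0};

    Vec3 image = imageSource(source, wallPoint, wallNormal);
    double directDist    = length(sub(listener, source));
    double reflectedDist = length(sub(listener, image));

    // The reflection behaves like a second source at the image position:
    // a longer path means a later arrival and, under 1/r spherical spreading,
    // a lower level relative to the direct sound.
    std::printf("direct: %.2f m (%.2f ms)\n",
                directDist, 1000.0 * directDist / speedOfSound);
    std::printf("first-order reflection: %.2f m (%.2f ms), gain vs direct %.3f\n",
                reflectedDist, 1000.0 * reflectedDist / speedOfSound,
                directDist / reflectedDist);
    return 0;
}

A real-time renderer would repeat this calculation for many surfaces and higher reflection orders, which is where the computational cost arises and why hybrid and perceptually informed approaches are of interest in this project.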

The project is well suited to a student with strong acoustics knowledge and programming skills who is looking to explore acoustic rendering using 3D game engines. It is supported by Sony Interactive Entertainment and will include an opportunity to develop the research through a placement at their London office.

Skills required:

Strong knowledge of acoustics and audio propagation effects. Familiarity with game development techniques using 3D game engines such as Unity or Unreal.

The successful applicant should have a strong interest in sound, music and immersive audio technology, and good programming skills. This project is highly multidisciplinary in nature, and we welcome applicants from a broad range of core research backgrounds and interests, extending from audio signal processing and machine learning to user experience design, human-computer interaction, and relevant creative practice.

If you have any questions about this project, please contact Damian Murphy at damian.murphy@york.ac.uk

Apply for this project at PhD in MT


Who should apply?

Candidates must have (or expect to obtain) a minimum of a UK upper second-class honours degree (2.1) or equivalent, and a strong interest in and experience of sound and audio technology. Their formal training and qualifications may be in disciplines not directly associated with audio engineering or metaverse technologies.

We especially welcome applications from candidates belonging to groups that are currently under-represented in metaverse-related industries; these include (but are not limited to) women, individuals from under-represented ethnicities, members of the LGBTQ+ community, people from low-income backgrounds, and people with physical disabilities.

How to apply

When applying, please select ‘CDT in Sound Interactions in the Metaverse’ as your source of funding. You do not need to provide a research proposal; just enter the name of the project you are applying for.

You can contact us at pet-sim@york.ac.uk

We expect strong competition and encourage those interested to submit applications as early as possible. The deadline for applications is 12:00 noon (GMT) on Monday 20th March 2023.