Apply for a SIM PhD
Fully-Funded PhD Studentships in Sound Interactions in the Metaverse (SIM)
Join a dynamic team of researchers exploring audio for extended reality technologies and the exciting impact it can have on our world. Three PhD studentships, starting September 2024 and based at the University of York, covering home fees and a stipend for 3.5 years.
SIM is an exciting opportunity to study in a collegiate research environment with world-leading academics, companies and third sector organisations conducting challenge-led research around sound related metaverse technologies. We are building a diverse group of motivated candidates interested in building cross-disciplinary skills and expertise linking levels of research across technical development and testing, content design, user experience, and evaluation of broader impact on society.
Why apply to SIM
The University of York is committed to conducting research that positively impacts people’s lives and to providing equality of opportunity. York currently holds the Athena Swan Bronze Award for gender equality in higher education.
The SIM cohort will grow in 2024 from the five PhD students who enrolled in 2023 to nine. Students receive training in research methods, recording techniques for XR, immersive audio systems, AI, responsible innovation and co-design. You will also develop a personal training plan with your supervisors and industry partner, drawing on existing taught modules from across the University. There will be opportunities to work directly with industry and third sector organisations through placements, research exchanges and knowledge exchange events.
The three available projects are listed below:
Stories of the Stones: Accessible heritage experiences through augmented audio reality
External Partner: York Museums Trust
Academic Supervisors: Jude Brereton and Anna Bramwell-Dicks, School of Arts and Creative Technologies
Wind Power and Psychoacoustics in the Metaverse
External Partner: AECOM Acoustics
Academic Supervisors: Frank Stevens and Ümit Cali, School of Physics, Engineering and Technology
Automatic Upmixing of Legacy Audio Content for Metaverse Applications Using a Machine Learning Approach
External Partner: BBC R&D
Academic Supervisor: Damian Murphy, AudioLab, School of Physics, Engineering and Technology
Who should apply?
Candidates must have (or expect to obtain) a minimum of a UK upper second-class honours degree (2.1) or equivalent, together with a strong interest in, and experience of, sound and audio technology. Their formal training and qualifications may be in disciplines not directly associated with audio engineering or metaverse technologies.
We especially welcome applications from candidates belonging to groups that are currently under-represented in metaverse-related industries; these include (but are not limited to): women, individuals from under-represented ethnicities, members of the LGBTQ+ community, people from low-income backgrounds, and people with physical disabilities.
How to apply
When applying please select ‘CDT in Sound Interactions in the Metaverse’ as your source of funding. You do not need to provide a research proposal, just enter the name of the project you are applying for.
Please visit the webpage of the project you are interested in for full details and application instructions. You can apply for more than one project. You can contact us at firstname.lastname@example.org
We expect strong competition and encourage those interested to submit applications as early as possible. The deadline for applications is Monday 29th January 2024.
In addition to the data sharing outlined in the University’s Privacy Notice - student applicants, applications will be shared with the external partner for the shortlisting process and interview panels. The external partner will delete details of unsuccessful applications once the selection process is complete.