
XR-OR: Extended Reality Analytics for Smart Operating Rooms and Augmented Surgery

Primary supervisor

Tim Dwyer

Research area

Embodied Visualisation

We seek to explore opportunities and challenges for the use of Extended Reality (XR) technologies (including augmented and virtual reality, as well as mixed-reality interaction techniques) to support surgeons, operating room technicians, and other professionals in and around the operating room. Particular areas that may be explored include:

  • Immersive OR analytics: using XR to analyse data from various operating room sources. We will focus on techniques that make such data available through hands-free interaction, so that data analysis can be performed even during operations. We will also explore the role of AI in guiding or suggesting analytics.
  • Patient Data Analysis, Procedure Planning & Organ Segmentation: we will explore immersive XR interaction techniques for patient data segmentation and surgery planning.
  • Hybrid User Interfaces and Collaboration: we will investigate XR support for collaboration across locations and devices, with a focus on surgical interventions, i.e. intra-operative use as well as surgery planning. Use cases include remote mentoring, intra-operative team collaboration, and training.
  • Service and Maintenance: XR support for hospital and surgical theatre technicians in service, maintenance, and repair tasks. Again, AI can be integrated to support and guide workers, for example in predictive maintenance and task planning.

Required knowledge

Stipend funding is available for domestic students only (Australian or New Zealand Citizens or Permanent Residents)

Applicants should have a strong programming and computer science background, ideally with experience developing interactive and immersive systems.

Project funding

Other

Learn more about minimum entry requirements.