Force-Extended Gesture Interaction for Augmented Reality

Primary supervisor

Barrett Ens

Co-supervisors

  • Gun Lee (UNISA)

Interaction with Augmented Reality (AR) and Virtual Reality (VR) depends on the development of new, natural input methods such as voice, eye gaze, and hand gestures. Hand gestures are useful for typical interactions such as pointing at and manipulating objects, but they can be made even more powerful by using electromyography (EMG) sensors to ‘extend’ them with an extra dimension of muscle force. For instance, the force of clenching the hand while using a virtual paintbrush can be mapped to the thickness or colour of the paint, making input both richer and easier.
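
As a rough illustration of this kind of mapping, the minimal Unity C# sketch below drives the width and colour of a paint stroke from a normalized EMG force reading. The ReadNormalizedForce() method and the field names are hypothetical placeholders rather than any existing EMG SDK; the reading is stubbed with an oscillation so the sketch runs without hardware.

```csharp
using UnityEngine;

// Minimal sketch of a force-extended brush: a normalized EMG force
// reading in [0, 1] modulates stroke width and colour, while the hand
// gesture itself would control the brush position.
public class ForceExtendedBrush : MonoBehaviour
{
    [SerializeField] private LineRenderer stroke;    // stands in for the paint stroke
    [SerializeField] private float minWidth = 0.005f;
    [SerializeField] private float maxWidth = 0.05f;

    private void Update()
    {
        float force = ReadNormalizedForce();

        // Map muscle force to stroke thickness...
        stroke.widthMultiplier = Mathf.Lerp(minWidth, maxWidth, force);

        // ...and to colour, e.g. cyan when relaxed, red when clenched.
        Color c = Color.Lerp(Color.cyan, Color.red, force);
        stroke.startColor = c;
        stroke.endColor = c;
    }

    // Hypothetical placeholder for a real EMG driver: would return the
    // rectified, smoothed muscle activation normalized to [0, 1].
    private float ReadNormalizedForce()
    {
        // Stubbed with a slow oscillation so the sketch runs without hardware.
        return 0.5f * (1f + Mathf.Sin(Time.time));
    }
}
```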

Aim/outline

This project aims to develop novel force-extended gesture techniques for interaction in AR and VR. The student will develop these techniques using EMG sensors, hand-tracking sensors such as the Leap Motion, and AR or VR displays such as the HoloLens or HTC Vive. The novel techniques will be evaluated in user studies and demonstrated in prototype applications.

Required knowledge

Skills in C# or another object-oriented programming language. Experience with Unity is a plus.