Modeling User Fatigue of Gesture Interaction

Primary supervisor

Barrett Ens

Mobile computing will soon move beyond the smartphone, with technology rapidly advancing in Augmented Reality, Virtual Reality, and the Internet of Things. Interaction with these technologies will rely on natural input methods such as voice, eye gaze, and hand gestures. Gesture interaction has received recent attention due to the proliferation of sensing technologies such as the Wii Remote, Kinect, and Leap Motion, but this work has also revealed limitations such as fatigue caused by holding the arms up for extended periods. However, as sensing technologies improve, there is potential to reduce fatigue through smaller, more precise motions known as ‘microgestures’.


This project aims to develop an improved model of user fatigue for microgesture interaction. A previous model, known as Consumed Endurance, treats arm fatigue as a function of arm angle and time, but does not take into account motions of the elbow, wrist, or fingers. The new model will be developed using electromyography (EMG) sensors to measure muscle fatigue in human participants. These measurements will be used first to evaluate the existing Consumed Endurance model and then to extend it to cover additional muscle groups.
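To make the idea of "fatigue as a function of arm angle and time" concrete, the sketch below illustrates the general shape of a Consumed-Endurance-style calculation: gravity-induced shoulder torque grows with arm angle, a Rohmert-style endurance curve converts relative effort into a maximum holding time, and consumed endurance is the fraction of that time already spent. All constants (arm mass, centre-of-mass distance, maximum torque) and the specific curve parameters are illustrative assumptions, not the published model's values.

```python
import math

# Illustrative constants -- real values would come from anthropometric tables.
ARM_MASS_KG = 3.5           # assumed mass of an average adult arm
COM_DISTANCE_M = 0.30       # assumed shoulder-to-centre-of-mass distance
MAX_SHOULDER_TORQUE = 40.0  # assumed maximum shoulder torque (N*m)
G = 9.81                    # gravitational acceleration (m/s^2)

def shoulder_torque(arm_angle_deg: float) -> float:
    """Gravity-induced torque at the shoulder for an arm raised at the given
    angle from the body (0 = hanging down, 90 = horizontal)."""
    return ARM_MASS_KG * G * COM_DISTANCE_M * math.sin(math.radians(arm_angle_deg))

def endurance_seconds(arm_angle_deg: float) -> float:
    """Rohmert-style endurance estimate: the closer the posture is to maximum
    strength, the shorter the time it can be held. Curve constants are
    illustrative."""
    pct = shoulder_torque(arm_angle_deg) / MAX_SHOULDER_TORQUE * 100
    if pct <= 15:  # below ~15% of max strength, holding time is effectively unlimited
        return float("inf")
    return 1236.5 / (pct - 15) ** 0.618 - 72.5

def consumed_endurance(arm_angle_deg: float, held_seconds: float) -> float:
    """Percentage of the available endurance already used up."""
    e = endurance_seconds(arm_angle_deg)
    return 0.0 if math.isinf(e) else min(100.0, held_seconds / e * 100)
```

Note how the model depends only on the shoulder angle and elapsed time; a flick of the wrist or a finger microgesture changes neither input, which is exactly the limitation this project seeks to address.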

Required knowledge

Skills in C#, or in another object-oriented programming language.