Creating a turnkey solution to classify, predict and simulate behaviour from videos of rodents

Primary supervisor

David Dowe


  • Andrew Lawrence


Rodent behavioural testing studies the neural mechanisms underlying emotions [1]. It is used in research on almost all mental conditions, including PTSD [2], OCD [3] and autism [4]. For example, to measure anxiety, researchers may place a rodent in a large tub, record a top-down video and measure the time spent near the safety of the walls [2]. These videos also contain rich information about behavioural patterns, but scoring them manually is time-consuming. For this reason, machine learning solutions have been developed to automate behavioural prediction [5-12]. DeepLabCut [5] is a tool that predicts body part locations, and B-SOiD [6] is a tool that predicts behaviours from those locations over time. However, DeepLabCut must first be trained by manually annotating rodent body part locations in hundreds of example video frames, and coding expertise is also needed. These barriers mean that automated behavioural prediction tools are accessible to only a fraction of behavioural research labs. This project aims to enable the widespread adoption of such tools by:


  1. Creating a pre-trained model that can detect rodent body part locations in videos of common behavioural tests, without requiring manual annotation.
  2. Applying existing tools [5, 6] to identify behavioural patterns in rodents linked to alcohol cravings and to mother-infant interactions following specific neuronal modifications.
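The wall-proximity anxiety measure described above can be scored directly from a tracked position over time. A minimal Python sketch, where the arena size, safety margin, and frame rate are illustrative assumptions rather than values from the project:

```python
# Illustrative sketch (not part of the project spec): scoring the
# wall-proximity ("thigmotaxis") anxiety measure from a tracked
# centre-of-body trajectory. Arena size, margin, and frame rate are
# made-up example values.

def time_near_walls(trajectory, arena_size, margin, fps):
    """Return seconds spent within `margin` units of any arena wall.

    trajectory: list of (x, y) positions, one per video frame.
    arena_size: (width, height) of the arena in the same units.
    """
    width, height = arena_size
    near = sum(
        1 for x, y in trajectory
        if x < margin or y < margin or x > width - margin or y > height - margin
    )
    return near / fps

# Example: 4 frames at 2 fps in a 100x100 arena; 2 frames are near a wall.
path = [(5, 50), (50, 50), (50, 96), (50, 50)]
print(time_near_walls(path, (100, 100), margin=10, fps=2))  # 1.0 seconds
```

Automating exactly this kind of calculation, from raw video rather than hand-scored positions, is what the pose-tracking tools below enable.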


To build this pre-trained model, the student will script a virtual mouse model [13] to traverse common behavioural apparatuses within a realistic simulation tool called Unreal Engine [14]. This would provide thousands of diverse example images with corresponding body part locations, which would be used to train a deep learning model [5, 7]. The model's high-quality body part predictions may allow unsupervised behavioural prediction tools [3, 6, 9] to be applied seamlessly. The student will aim to create a Google Colab [15] notebook that connects rodent videos to behavioural predictions. Additionally, the student will apply DeepLabCut [5] and B-SOiD [6] to existing datasets in relation to Aim 2.
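The key advantage of the simulation step is that body part labels come for free: the 3D position of every virtual body part is known, so its 2D pixel location in each rendered frame can be computed by projecting through the simulated camera. A minimal sketch of this idea using a pinhole-camera model for a top-down camera; the focal length, camera height, and body part names are made-up examples, not Unreal Engine's API:

```python
# Illustrative sketch: labelling synthetic frames by projecting known 3D
# body part positions into 2D pixel coordinates with a pinhole camera
# model. Camera parameters and body parts are made-up example values.

def project_top_down(point_3d, focal_px, cam_height, image_center):
    """Project a 3D point (x, y, z) to pixel (u, v) for a camera looking
    straight down from `cam_height` above the arena floor (z = 0)."""
    x, y, z = point_3d
    depth = cam_height - z  # distance from the camera to the point
    u = image_center[0] + focal_px * x / depth
    v = image_center[1] + focal_px * y / depth
    return (u, v)

# Known body part positions taken from the simulated mouse (arena units).
bodyparts = {"snout": (0.1, 0.0, 0.02), "tailbase": (-0.1, 0.0, 0.02)}
labels = {
    name: project_top_down(p, focal_px=500, cam_height=1.0,
                           image_center=(320, 240))
    for name, p in bodyparts.items()
}
print(labels)
```

Repeating this for every rendered frame yields the image/label pairs that would otherwise require hundreds of hours of manual annotation.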

Expected outcomes

  1. Create a reliable pre-trained model for detecting rodent body part locations in behavioural apparatuses.
  2. Build an analysis pipeline that connects raw videos to behavioural predictions in Google Colab [15]. This may eliminate the need for coding expertise.
  3. If there is time, generalise this work to side-on and bottom-up camera angles or to multiple rodents.
  4. Gain insights into the behavioural predictors of alcohol cravings and rodent mother-infant interactions.
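As an illustration of how the pipeline in outcome 2 could turn pose data into behavioural predictions, unsupervised tools such as B-SOiD derive per-frame features (for example, inter-body-part distances and movement speed) from the keypoint time series before clustering them into behaviours. A minimal sketch; the body part names, feature choices, and frame data are made-up examples, not B-SOiD's actual feature set:

```python
import math

# Illustrative sketch: deriving simple per-frame features (body length
# and centre-of-body speed) from tracked keypoints, the kind of input an
# unsupervised behaviour classifier could cluster. Values are made up.

def pose_features(frames, fps):
    """frames: list of dicts mapping body part name to an (x, y) position."""
    features = []
    prev = None
    for kp in frames:
        (sx, sy), (tx, ty) = kp["snout"], kp["tailbase"]
        length = math.hypot(sx - tx, sy - ty)  # body elongation
        cx, cy = (sx + tx) / 2, (sy + ty) / 2  # body centre
        speed = math.hypot(cx - prev[0], cy - prev[1]) * fps if prev else 0.0
        features.append({"length": length, "speed": speed})
        prev = (cx, cy)
    return features

frames = [
    {"snout": (0.0, 0.0), "tailbase": (3.0, 4.0)},
    {"snout": (1.0, 0.0), "tailbase": (4.0, 4.0)},
]
print(pose_features(frames, fps=30))
```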


This pipeline may make automated behavioural prediction a routine and simple process for all behavioural researchers, which would enhance research into almost all mental health disorders. A greater understanding of the neurons responsible for specific behaviours in these disorders may eventually lead to better pharmacological therapies that target those neurons. Finally, the student may gain insights into the neuronal basis of addiction and maternal care.
