
Research projects in Information Technology

Displaying 51–60 of 101 projects.


Predicting fracture outcomes from clinical Registry data using Artificial Intelligence Supplemented models for Evidence-informed treatment (PRAISE) study

Project description: On behalf of the Victorian Orthopaedic Trauma Outcomes Registry (VOTOR), we will establish the role of artificial intelligence (AI), specifically deep learning, in improving the prediction of clinical and longer-term patient-reported outcomes following distal radius (wrist) fractures. The PRAISE study will, for the first time, use a flexible three-stage multimodal deep learning fracture reasoning system to unlock important information from unstructured data sources, including X-ray images and surgical and radiology text reports.
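
To make the multimodal idea concrete, here is a minimal sketch of late fusion over pre-extracted image and text features, assuming PyTorch; the dimensions, module names and upstream encoders are illustrative assumptions, not a description of the actual PRAISE system.

# Hypothetical late-fusion model over pre-extracted features; the sizes and
# names are assumptions for illustration, not the PRAISE architecture.
import torch
import torch.nn as nn

class MultimodalFusion(nn.Module):
    def __init__(self, img_dim=512, txt_dim=768, hidden=256, n_outcomes=2):
        super().__init__()
        self.img_proj = nn.Sequential(nn.Linear(img_dim, hidden), nn.ReLU())
        self.txt_proj = nn.Sequential(nn.Linear(txt_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, n_outcomes)  # fused outcome head

    def forward(self, img_feat, txt_feat):
        # Encoding the X-rays and reports (e.g. with a CNN and a clinical
        # language model) is assumed to have happened upstream.
        z = torch.cat([self.img_proj(img_feat), self.txt_proj(txt_feat)], dim=-1)
        return self.head(z)

model = MultimodalFusion()
logits = model(torch.randn(4, 512), torch.randn(4, 768))  # batch of 4 patients
print(logits.shape)  # torch.Size([4, 2])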

Supervisor: Dr Lan Du

Characterising Model Complexity for Data-driven Scientific Discovery

This project aims to explore techniques for characterising the complexity of statistical models. By complexity we refer to the ability of a model to learn patterns and potentially generalise to new, unseen data. Interest in this area has recently resurged due to the discovery of phenomena such as "double descent" and the use of new model types such as deep neural networks, which challenge traditional notions of complexity.
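
As an assumed, self-contained illustration of the phenomenon, the sketch below fits minimum-norm least-squares models on random ReLU features; test error typically spikes near the interpolation threshold (number of features close to the number of training points) and descends again beyond it.

# Toy double-descent demonstration with numpy; all data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, d = 40, 500, 10
X = rng.normal(size=(n_train + n_test, d))
y = X @ rng.normal(size=d) + 0.5 * rng.normal(size=n_train + n_test)
Xtr, ytr, Xte, yte = X[:n_train], y[:n_train], X[n_train:], y[n_train:]

for n_feat in [5, 20, 40, 80, 320, 1280]:
    W = rng.normal(size=(d, n_feat))                           # random projection
    Ftr, Fte = np.maximum(Xtr @ W, 0), np.maximum(Xte @ W, 0)  # ReLU features
    beta = np.linalg.pinv(Ftr) @ ytr                           # minimum-norm fit
    print(f"features={n_feat:5d}  test MSE={np.mean((Fte @ beta - yte) ** 2):.3f}")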

Supervisor: Dr Daniel Schmidt

Complex question answering & generation over knowledge graphs

Complex questions are those involving discrete aggregate operators over numbers (min, max, arithmetic) and sets (intersection, union, difference). Recent advances in complex question answering take a neural-symbolic approach and combine meta-learning and reinforcement learning techniques [1,2,3]. The dual problem, the generation of complex questions, is less explored; recent work on knowledge graph question generation [4,5] has mainly focussed on multi-hop questions.
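
To ground the operators, here is a hand-rolled sketch over a toy knowledge graph of (subject, relation, object) triples; the entities and relations are invented for illustration, and neural-symbolic systems learn to compose such operations rather than executing them directly.

# Toy knowledge graph and one "complex" question answered by set intersection.
triples = {
    ("Alice", "worksAt", "Acme"), ("Bob", "worksAt", "Acme"),
    ("Bob", "livesIn", "Melbourne"), ("Carol", "livesIn", "Melbourne"),
}

def entities(rel, obj):
    """Subjects s such that (s, rel, obj) is in the graph."""
    return {s for (s, r, o) in triples if r == rel and o == obj}

# "Who works at Acme AND lives in Melbourne?"
print(entities("worksAt", "Acme") & entities("livesIn", "Melbourne"))  # {'Bob'}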

Supervisor: Dr Yuan-Fang Li

Logic and Games for Automated Verification

Model checking is an automated formal verification technique in which, given a property F – typically represented as a temporal logic formula – and a model of a system M, one checks whether M satisfies F. The technique is well understood and supported by several tools, many of which are available online; rather less is known about probabilistic model checking.
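
The simplest instance of the idea is an invariant check by exhaustive state-space exploration, sketched below under assumed toy definitions; real model checkers handle full temporal logics and, in the probabilistic case, numerical computation over Markov models.

# Explicit-state check of a safety property ("the counter never exceeds 3")
# on a toy transition system given as a successor function.
from collections import deque

def successors(state):
    return {(state + 1) % 4, 0}  # increment modulo 4, or reset to 0

def check_invariant(initial, invariant):
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return False, s  # counterexample state
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True, None

print(check_invariant(0, lambda s: s <= 3))  # (True, None)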

Human-Centred Multimodal Teamwork Analytics

This scholarship will provide a stipend allowance of AUD $29,000 per annum for up to 3.5 years, plus a $4,000 travel allowance. If you are currently in Australia, you are strongly encouraged to apply. If successful, you will join Dr.

ITTC OPTIMA projects

The Australian Research Council (ARC) Training Centre in Optimisation Technologies, Integrated Methodologies, and Applications (OPTIMA) is seeking applications for ten fully funded ARC PhD projects with generous top-up scholarships.

We're looking for talented students with a background in mathematics, computer science, statistics, economics, engineering or a related field. These positions are offered across OPTIMA's nodes at Monash University and The University of Melbourne. Projects will be available from June 2021 onwards.

Supervisor: Prof Peter Stuckey

Precision medicine for paediatric brain cancer patients

The proposed PhD project aims to build a machine learning/deep learning-based decision support system that provides recommendations on precision medicine for paediatric brain cancer patients, based on clinical, genomic and functional dependency data (CRISPR, drug screens).
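
As a loose, assumed sketch of a starting point for such a system (not the project's actual pipeline), the following combines synthetic stand-ins for clinical and genomic feature blocks in a single scikit-learn classifier.

# Synthetic data only; a real pipeline would use curated patient features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
clinical = rng.normal(size=(200, 5))    # stand-in for e.g. age, tumour grade
genomic = rng.normal(size=(200, 50))    # stand-in for expression / CRISPR scores
X = np.hstack([clinical, genomic])
y = rng.integers(0, 2, size=200)        # stand-in for responder vs non-responder

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # near chance on random data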


Anomaly detection in evolving (dynamic) graphs

Anomaly detection is an important task in data mining. Traditionally, most anomaly detection algorithms have been designed for ‘static’ datasets, in which all the observations are available at once. In non-stationary environments, on the other hand, the same algorithms cannot be applied, as the underlying data distributions change constantly and previously fitted models cease to be valid. Hence, we need to devise adaptive models that take into account the dynamically changing characteristics of the environment and detect anomalies in ‘evolving’ data.
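
A minimal sketch of this adaptivity, on a univariate stream rather than a graph for simplicity: an exponentially weighted mean and variance track the drifting distribution, and points far from the current estimate are flagged. The weighting constant and threshold below are illustrative assumptions.

# Adaptive anomaly detection on an evolving data stream.
import numpy as np

def streaming_anomalies(stream, alpha=0.05, z_thresh=3.0):
    mean, var = stream[0], 1.0
    flags = []
    for x in stream[1:]:
        flags.append(abs(x - mean) / (var ** 0.5 + 1e-8) > z_thresh)
        # Update the statistics so the model drifts with the distribution.
        mean = (1 - alpha) * mean + alpha * x
        var = (1 - alpha) * var + alpha * (x - mean) ** 2
    return flags

rng = np.random.default_rng(2)
stream = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
stream[250] = 12.0  # inject a point-wise anomaly
print(sum(streaming_anomalies(stream)))  # injected point plus the shift onset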

Supervisor: Dr Mahsa Salehi

Clustering of (time series of) generalised dynamic Bayesian nets, etc.

The relationship between the information-theoretic Bayesian minimum message length (MML) principle and the notion of Solomonoff-Kolmogorov complexity from algorithmic information theory (Wallace and Dowe, 1999a) ensures that – at least in principle, given enough search time – MML can infer any underlying computable model in a data-set.

A consequence of this is that we can (e.g.)
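
For reference, the standard two-part MML formulation (as in Wallace and Boulton, 1968) scores a hypothesis H on data D by the total length of a message that first asserts H and then encodes D assuming H:

% Two-part message length: the best hypothesis minimises the total.
\mathrm{MsgLen}(H, D) = \underbrace{-\log P(H)}_{\text{assert } H} \;+\; \underbrace{-\log P(D \mid H)}_{\text{encode } D \text{ given } H}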

Does deep learning over-fit – and, if so, how does it work?

Methods of balancing model complexity with goodness of fit include Akaike's information criterion (AIC), Schwarz's Bayesian information criterion (BIC), minimum description length (MDL) and minimum message length (MML) (Wallace and Boulton, 1968; Wallace and Freeman, 1987; Wallace and Dowe, 1999a; Wallace, 2005).
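
For reference, the first two criteria have standard closed forms, with k the number of free parameters, n the number of observations and \hat{L} the maximised likelihood (smaller is better in both cases):

% Standard definitions of AIC and BIC.
\mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad \mathrm{BIC} = k\ln n - 2\ln\hat{L}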