This project aims to identify novel methods for inferring actors, activities, and other elements from short-message communications. Covert communication is a specialist analysis domain in the Law Enforcement (LE) context. The project seeks to improve law enforcement's understanding of online criminal communications by exploring texts for automated understanding of intent, sentiment, criminal capability, and involvement.
Research projects in Information Technology
Explainability of AI techniques in law enforcement and the judiciary
This project will investigate and develop ways in which AI algorithms and practices can be made transparent and explainable for use in law enforcement and judicial applications.
Ethics of AI application in law enforcement
The use of AI in law enforcement and judicial domains requires consideration of a number of ethical issues. This project will investigate and develop frameworks that embed ethical principles in the research, development, deployment, and use of AI systems in law enforcement (LE). A major focus is expected to be the acquisition, use, sharing, and governance of data for AI in this context.
Predicting fractures outcomes from clinical Registry data using Artificial Intelligence Supplemented models for Evidence-informed treatment (PRAISE) study
On behalf of the Victorian Orthopaedic Trauma Outcomes Registry (VOTOR), we will establish the role of artificial intelligence (AI) deep learning in improving the prediction of clinical and longer-term patient-reported outcomes following distal radius (wrist) fractures. The PRAISE study will, for the first time, use a flexible three-stage multimodal deep learning fracture reasoning system to unlock important information from unstructured data sources, including X-ray images and surgical and radiology text reports.
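The three-stage fracture reasoning system itself is specific to the PRAISE study; purely as an illustration of the kind of multimodal fusion involved, the sketch below combines a stand-in image encoder with a stand-in text encoder for outcome prediction. All layer sizes, encoder choices, and the late-fusion design are assumptions made for this example, not the study's architecture.

```python
# Hypothetical minimal sketch of multimodal fusion (image + report text) for
# outcome prediction. This is NOT the PRAISE architecture; the encoders,
# layer sizes, and fusion strategy are illustrative assumptions only.
import torch
import torch.nn as nn

class MultimodalOutcomeModel(nn.Module):
    def __init__(self, img_dim=512, txt_dim=256, n_outcomes=2):
        super().__init__()
        # Stand-ins for an image encoder (e.g. a CNN backbone) and a text
        # encoder (e.g. a small transformer); here plain linear layers.
        self.img_encoder = nn.Sequential(nn.Linear(img_dim, 128), nn.ReLU())
        self.txt_encoder = nn.Sequential(nn.Linear(txt_dim, 128), nn.ReLU())
        # Late fusion: concatenate modality embeddings, then classify.
        self.head = nn.Linear(128 + 128, n_outcomes)

    def forward(self, img_feats, txt_feats):
        z = torch.cat([self.img_encoder(img_feats),
                       self.txt_encoder(txt_feats)], dim=-1)
        return self.head(z)

# Usage with random stand-in features for a batch of 4 patients.
model = MultimodalOutcomeModel()
logits = model(torch.randn(4, 512), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 2])
```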
Complex question answering & generation over knowledge graphs
Complex questions are those that involve discrete aggregation operators over numbers (min, max, arithmetic) and sets (intersection, union, difference). Recent advances in complex question answering take a neural-symbolic approach and combine meta-learning and reinforcement learning techniques [1,2,3]. The dual problem, the generation of complex questions, is less explored: recent work on knowledge graph question generation [4,5] has mainly focussed on multi-hop questions.
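As a concrete illustration of these operators, the toy sketch below answers a complex question over a hand-built triple store by composing a set intersection with a numeric max. The graph, entities, and question are invented for the example; real systems learn to compose such operators (for instance via neural-symbolic programs) rather than hard-coding them.

```python
# Toy illustration of a "complex" KG question combining set and numeric operators.
# The graph, attribute table, and question are hypothetical.

triples = {
    ("Danube", "flows_through", "Germany"),
    ("Danube", "flows_through", "Austria"),
    ("Inn", "flows_through", "Austria"),
    ("Rhine", "flows_through", "Germany"),
}
lengths_km = {"Danube": 2850, "Inn": 518, "Rhine": 1233}

def subjects_of(relation: str, obj: str) -> set:
    """Return all subjects s such that (s, relation, obj) is in the graph."""
    return {s for (s, r, o) in triples if r == relation and o == obj}

# Question: "Which river that flows through both Germany and Austria is the longest?"
# Step 1: set operator (intersection).
candidates = subjects_of("flows_through", "Germany") & subjects_of("flows_through", "Austria")
# Step 2: numeric aggregation (max over an attribute).
answer = max(candidates, key=lengths_km.get)
print(answer)  # -> "Danube"
```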
Human-Centred Multimodal Teamwork Analytics
This scholarship will provide a stipend allowance of $29,000 AUD per annum for up to 3.5 years, plus a $4,000 travel allowance. If you are currently in Australia, you are strongly encouraged to apply. If successful, you will join Dr.
ITTC OPTIMA projects
The Australian Research Council (ARC) Training Centre in Optimisation Technologies, Integrated Methodologies, and Applications (OPTIMA) is seeking applications for ten ARC fully-funded PhD projects with generous top-up scholarships.
We're looking for talented students with a background in mathematics, computer science, statistics, economics, engineering, or other related fields. These positions are offered across OPTIMA's nodes at Monash University and The University of Melbourne. Projects will be available from June 2021 onwards.
Precision medicine for paediatric brain cancer patients
The proposed PhD project aims to build a machine learning/deep learning-based decision support system that provides precision medicine recommendations for paediatric brain cancer patients based on clinical, genomic, and functional dependency data (CRISPR, drug screens).
Anomaly detection in evolving (dynamic) graphs
Anomaly detection is an important task in data mining. Traditionally, most anomaly detection algorithms have been designed for ‘static’ datasets, in which all observations are available at once. In non-stationary environments, on the other hand, these algorithms cannot be applied directly: the underlying data distributions change constantly, so a fixed model quickly becomes invalid. Hence, we need adaptive models that take the dynamically changing characteristics of the environment into account and detect anomalies in ‘evolving’ data.
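To illustrate the adaptive idea on the simplest possible case (a numeric stream rather than a graph stream), the sketch below re-estimates what ‘normal’ looks like over a sliding window, so the detector tracks a drifting distribution. The window size, warm-up length, and threshold are illustrative assumptions, not project recommendations.

```python
# Minimal sketch of anomaly detection in evolving data: a streaming z-score
# detector whose mean/std are re-estimated over a sliding window, so the
# notion of "normal" adapts as the distribution drifts.
from collections import deque
import random
import statistics

def streaming_anomalies(stream, window=100, threshold=3.0):
    """Yield (index, value) pairs flagged as anomalous relative to the recent window."""
    recent = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(recent) >= 10:  # warm-up: need a few points before scoring
            mu = statistics.fmean(recent)
            sigma = statistics.pstdev(recent) or 1e-9
            if abs(x - mu) / sigma > threshold:
                yield i, x
        recent.append(x)

# Example: a slowly drifting signal with one injected spike.
random.seed(0)
data = [0.01 * t + random.gauss(0, 1) for t in range(1000)]
data[500] += 15  # injected anomaly
# The spike at index 500 should be among the flagged points; a few noisy
# points may also exceed the threshold.
print(list(streaming_anomalies(data)))
```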