In recent years, social media have become common platforms for criminals to stalk, intimidate, manipulate, and abuse vulnerable people, such as women and young people. A recent survey of students in grades 6 to 9 found that rates of electronic bullying were between 16% and 19% for girls and between 11% and 19% for boys, and that 33.47% of sexually abused girls reported experiencing cyberbullying, compared with 17.75% of non-sexually abused girls.
Research projects in Information Technology
Privacy-Enhancing Technologies for the Social Good
Privacy-Enhancing Technologies (PETs) are a set of cryptographic tools that allow information processing in a privacy-respecting manner. As an example, imagine we have a user, say Alice, who wants to get a service from a service provider, say SerPro. To provide the service, SerPro requests Alice's private information such as a copy of her passport to validate her identity. In a traditional setting, Alice has no choice but to give away her highly sensitive information.
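The Alice/SerPro scenario can be illustrated with a toy hash-based commitment, one of the simplest cryptographic building blocks behind PETs. This is only a sketch under assumed names (`commit`, `verify` are illustrative, not part of any specific PET library); real deployments use anonymous credentials and zero-knowledge proofs rather than bare commitments:

```python
import hashlib
import secrets

def commit(value: bytes) -> tuple[bytes, bytes]:
    """Alice binds herself to a value without revealing it.
    Returns (commitment, opening nonce); the nonce hides the value."""
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + value).digest()
    return digest, nonce

def verify(commitment: bytes, value: bytes, nonce: bytes) -> bool:
    """SerPro later checks an opened value against the earlier commitment."""
    return hashlib.sha256(nonce + value).digest() == commitment

# Alice commits to an attribute (e.g. date of birth) instead of handing
# over her whole passport up front; she opens it only if and when needed.
c, r = commit(b"1990-01-01")
honest = verify(c, b"1990-01-01", r)    # True: correct opening
forged = verify(c, b"1991-01-01", r)    # False: a different value fails
```

The commitment is *hiding* (SerPro learns nothing from `c` alone) and *binding* (Alice cannot later open it to a different value), which is the minimal privacy-respecting property the blurb alludes to.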
Medical AI Exploration with Machine Learning and Robotics
PhD Studentship Description:
Disentangled Representation Learning for Synthetic Data Generation and Privacy Protection
Synthetic data generation has drawn growing attention due to the lack of training data in many application domains. It is useful for privacy-sensitive applications, e.g. digital health applications based on electronic medical records. It is also attractive for novel applications, e.g. multimodal applications in the metaverse, which have little data for training and evaluation. This project focuses on synthetic data generation for audio and the corresponding multimodal applications, such as mental health chatbots and digital assistants for negotiations.
[NextGen] Secure and Privacy-Enhancing Federated Learning: Algorithms, Framework, and Applications to NLP and Medical AI
Federated learning (FL) is an emerging machine learning paradigm that enables distributed clients (e.g., mobile devices) to jointly train a machine learning model without pooling their raw data into a centralised server. Because data never leaves the clients, FL systematically mitigates the privacy risks of centralised machine learning and naturally complies with rigorous data privacy regulations, such as the GDPR and the Privacy Act 1988.
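The FL training loop described above can be sketched with federated averaging (FedAvg) on a toy linear model. This is a minimal illustration, not the project's implementation: clients run local gradient steps on their own data and send back only model weights, which the server averages by dataset size.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient steps on a linear model (squared loss).
    Only the updated weights leave the client; the raw (X, y) never do."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fed_avg(global_w, client_data):
    """Server step: average client updates, weighted by local dataset size."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Simulate three clients holding disjoint shards of the same linear task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.05, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):          # communication rounds
    w = fed_avg(w, clients)
```

After a few communication rounds the global model approaches the true weights even though the server never sees any client's raw data.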
Explainable AI (XAI) in Medical Imaging
Are you interested in applying your AI/DL knowledge to the medical domain?
Development of AI-based Point-of-Care MRI
Portable point-of-care medical devices have revolutionised the way in which people receive medical treatment. They can bring timely and adequate care to people in need and also open up the opportunity to address healthcare inequality for rural and remote communities.
Machine Learning for faster and safer MRI and PET imaging
Machine learning has recently made significant progress for medical imaging applications including image segmentation, enhancement, and reconstruction.
Funded as an Australian Research Council Discovery Project, this research aims to develop highly novel physics-informed deep learning methods for Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) and applications in image reconstruction and data analysis.
Guarding On-device Machine Learning Models via Privacy-enhancing Techniques
On-device machine learning (ML) is rapidly gaining popularity on mobile devices. Mobile developers can use on-device ML to enable ML features on users' mobile devices, such as face recognition, augmented and virtual reality, voice assistance, and medical diagnosis. This new paradigm is further accelerated by AI chips and ASICs embedded in mobile devices, e.g., Apple's Bionic neural engine. Compared to cloud-based machine learning services, on-device ML is privacy-friendly, offers low latency, and can work offline.
Privacy-Aware Rewriting
Despite the popularity of text analysis offered as a service by high-tech companies, it is still challenging to develop and deploy NLP applications involving sensitive and demographic information, especially when that information is expected to be shared with transparency and legislative compliance. Differential privacy (DP) is widely applied to protect the privacy of individuals by achieving an attractive trade-off between the utility of information and confidentiality.
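The utility/confidentiality trade-off DP offers can be made concrete with the classic Laplace mechanism for a counting query (a textbook sketch, not this project's rewriting method): noise scaled to sensitivity/epsilon makes the released count epsilon-differentially private, and smaller epsilon means stronger privacy but noisier answers.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value with epsilon-DP by adding Laplace noise of
    scale sensitivity/epsilon (the standard Laplace mechanism)."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Counting query: how many documents mention a sensitive attribute.
# Sensitivity is 1 because adding or removing one individual's document
# changes the count by at most 1.
rng = np.random.default_rng(42)
true_count = 130
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0, rng=rng)
```

Averaged over many hypothetical releases the noise cancels out (utility is preserved in expectation), while any single release reveals little about whether one individual's record is present.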