Primary supervisor
Leimin Tian
Co-supervisors
During a dialogue, people express emotions to share their internal states with their partner, while also perceiving the emotions their partner expresses. How these emotional dynamics unfold over a dialogue conveys rich information, such as the social relationship within the dyad, hot spots in the conversation, and outcomes of the session. This project aims to capture the temporal and interpersonal emotional dynamics in dialogues by modelling existing datasets of emotional dialogues. Such models can be incorporated into a conversational agent to improve its performance and social intelligence.
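As a rough illustration of the kind of data involved, the sketch below shows one possible turn-level representation of a dyadic emotional dialogue, assuming categorical per-utterance emotion annotations in the style of IEMOCAP; the `Turn` class, label names, and example dialogue are illustrative assumptions rather than any dataset's actual format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Turn:
    speaker: str    # e.g. "A" or "B" in a dyad
    emotion: str    # categorical label, e.g. "neutral", "happy", "sad", "angry"
    text: str = ""  # utterance transcript (optional here)

# A dialogue is an ordered list of turns. Temporal dynamics are patterns
# along this sequence; interpersonal dynamics are patterns in how one
# speaker's emotion relates to the other's across adjacent turns.
Dialogue = List[Turn]

example: Dialogue = [
    Turn("A", "neutral", "How was the interview?"),
    Turn("B", "sad", "Not great, I don't think I got it."),
    Turn("A", "sad", "Oh no, I'm sorry to hear that."),
    Turn("B", "neutral", "It's fine, there will be others."),
]
```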
Student cohort
Double Semester
Aim/outline
Basic goals:
- Modelling temporal emotional dynamics in an existing dialogue dataset
- Modelling interpersonal emotional dynamics in an existing dialogue dataset (see the sketch after this list for one possible starting point)
- Modelling relationships between emotional dynamics and other factors, such as session outcomes, speaker relations, and dialogue hot spots
Possible extensions:
- Conducting cross-corpora analysis to understand the generalizability of identified emotional dynamics
- Applying the identified emotional dynamics in a conversational agent
- Performing user studies to evaluate the gain of incorporating emotions in a conversational agent
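As a minimal sketch of how the first two basic goals might be approached, the snippet below estimates first-order emotion transition probabilities from annotated dialogues, reusing the illustrative `Turn` representation sketched above. Counting every pair of consecutive turns gives a view of temporal dynamics; restricting the counts to speaker changes gives a view of interpersonal dynamics. The function name, label set, and overall approach are assumptions for illustration, not a prescribed method for the project.

```python
from collections import Counter

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

def transition_matrix(dialogues, interpersonal=False):
    """Estimate P(next emotion | current emotion) from turn sequences.

    Each dialogue is a list of turns with .speaker and .emotion attributes.
    With interpersonal=False, every pair of consecutive turns is counted
    (temporal dynamics); with interpersonal=True, only pairs where the
    speaker changes are counted, i.e. how one partner's emotion follows
    the other's (interpersonal dynamics).
    """
    counts = Counter()
    for dialogue in dialogues:
        for prev, cur in zip(dialogue, dialogue[1:]):
            if interpersonal and prev.speaker == cur.speaker:
                continue
            counts[(prev.emotion, cur.emotion)] += 1
    # Normalise each row into conditional probabilities.
    matrix = {}
    for e_from in EMOTIONS:
        row_total = sum(counts[(e_from, e_to)] for e_to in EMOTIONS)
        matrix[e_from] = {
            e_to: (counts[(e_from, e_to)] / row_total) if row_total else 0.0
            for e_to in EMOTIONS
        }
    return matrix

# Example usage with the dialogue sketched earlier:
# print(transition_matrix([example], interpersonal=True))
```

Relating such transition patterns to session outcomes, speaker relations, or hot spots (the third basic goal) could then be framed as comparing these statistics across dialogues grouped by those factors.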
URLs/references
- The IEMOCAP dataset: https://sail.usc.edu/iemocap/
- Ma, Y., Nguyen, K.L., Xing, F.Z. and Cambria, E., 2020. A survey on empathetic dialogue systems. Information Fusion, 64, pp.50-70.
- Oh, K.J., Lee, D., Ko, B. and Choi, H.J., 2017. A chatbot for psychiatric counseling in mental healthcare service based on emotional dialogue analysis and sentence generation. In 2017 18th IEEE International Conference on Mobile Data Management (MDM) (pp. 371-375). IEEE.
- Schuller, B.W., 2018. Speech emotion recognition: Two decades in a nutshell, benchmarks, and ongoing trends. Communications of the ACM, 61(5), pp.90-99.
Required knowledge
Python programming, machine learning and/or deep learning