Beyond Queries: Emotion-Aware Content Exploration from User Interaction (Master Project, Ongoing)
Author
Description
Traditional multimedia retrieval systems rely almost exclusively on explicit queries to determine what content a user wants. While effective when the information need is well defined, this approach fails to capture the implicit or subconscious signals users exhibit while exploring multimedia: users may not know what they are looking for beforehand, or may struggle to express their intent in words.
Recent advances in affective computing and interaction tracking enable the estimation of user emotion, attention, and engagement from signals such as gaze, facial expressions, and interaction behaviour. These signals offer a rich yet largely untapped view of user preferences. Integrating them into multimedia retrieval enables query-free exploration, in which the system adapts content presentation based on user reactions rather than explicit requests.
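To make the signal side concrete, the following minimal Python sketch aggregates gaze fixations into per-item engagement scores. The event format, the use of dwell time as an attention proxy, and the normalisation are illustrative assumptions, not part of the project specification.

```python
# Minimal sketch: per-item engagement from gaze dwell time.
# The (item_id, dwell_ms) event format is an illustrative assumption.
from collections import defaultdict

def engagement_scores(fixations):
    """Aggregate gaze fixations into normalised engagement scores.

    fixations: iterable of (item_id, dwell_ms) gaze events.
    Returns a dict item_id -> engagement in [0, 1], using total
    dwell time as a simple proxy for attention.
    """
    dwell = defaultdict(float)
    for item_id, dwell_ms in fixations:
        dwell[item_id] += dwell_ms
    max_dwell = max(dwell.values(), default=1.0)
    return {item: d / max_dwell for item, d in dwell.items()}
```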
This master's project investigates this idea by designing and evaluating an emotion-aware content exploration system. Instead of relying only on typed queries, the system will track what the user looks at, interacts with, or emotionally responds to within an image or video collection. These signals are then used to infer interest and refine retrieval results through query expansion and re-ranking, as sketched below. The work will use an established multimedia collection, such as V3C, or an image dataset. Emotion and interaction signals will be captured from the user, not from the collection itself.
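The feedback loop can be sketched as follows, assuming items are represented in a shared embedding space (e.g. CLIP-like features) and that per-item interest scores have already been inferred from the captured signals. The Rocchio-style query expansion and the blending weights are illustrative choices, not the project's fixed design.

```python
# Minimal sketch of interest-driven query expansion and re-ranking.
# All weights (alpha, beta, gamma) and shapes are illustrative assumptions:
# query_vec has shape (d,), item_vecs has shape (n, d), interest has shape (n,).
import numpy as np

def expand_query(query_vec, item_vecs, interest, alpha=1.0, beta=0.75):
    """Rocchio-style expansion: shift the query embedding toward items
    the user showed interest in, weighted by the interest scores."""
    weights = np.asarray(interest, dtype=float)
    feedback = weights @ item_vecs / max(weights.sum(), 1e-9)
    return alpha * query_vec + beta * feedback

def rerank(query_vec, item_vecs, interest, gamma=0.3):
    """Re-rank items by blending cosine similarity to the (expanded)
    query with the inferred per-item interest."""
    sims = item_vecs @ query_vec / (
        np.linalg.norm(item_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9)
    scores = (1 - gamma) * sims + gamma * np.asarray(interest, dtype=float)
    return np.argsort(-scores)  # indices of items, best first
```

In an interactive session, such functions would be applied after each round of observed reactions, progressively steering the result list toward content the user responds to without requiring a reformulated query.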
Start / End Dates
2026/02/16 - 2026/06/16