Enhancing Multimedia Retrieval with Emotion-Aware Augmented Reality Interfaces (Bachelor Thesis, Ongoing)
Author
Description
The rapid evolution of multimedia systems has shifted user expectations towards more intuitive and personalized experiences. Traditional multimedia retrieval systems focus primarily on explicit user inputs such as keywords or tags, and often fail to capture deeper nuances of human interaction like emotion. Emotions are central to how users interact with and interpret digital content. Integrating emotion recognition into multimedia retrieval allows systems to transcend traditional interaction methods, enabling more context-aware and engaging user experiences.
This thesis seeks to bridge the gap between user emotions and multimedia systems by incorporating real-time emotion recognition into an augmented reality (AR) application. This emotion-aware system will not only adapt retrieval results dynamically based on the user's emotional state, but also enable the use of emotions as direct inputs for searching emotionally charged multimedia content. The end goal is to deliver a more natural and immersive user experience.
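To make the adaptation idea concrete, the following is a minimal sketch of emotion-aware re-ranking: keyword relevance is blended with how well each item's emotional profile matches the user's detected (or explicitly queried) emotional state. The item schema, the function names, and the blending weight alpha are illustrative assumptions, not part of the thesis design.

```python
from dataclasses import dataclass

EMOTIONS = ["joy", "sadness", "anger", "surprise"]

@dataclass
class MediaItem:
    title: str
    relevance: float        # keyword/tag relevance score in [0, 1]
    emotion_profile: dict   # per-emotion intensity in [0, 1]

def emotion_similarity(query_emotions: dict, item: MediaItem) -> float:
    """Mean product of matching emotion intensities (illustrative metric)."""
    return sum(
        query_emotions.get(e, 0.0) * item.emotion_profile.get(e, 0.0)
        for e in EMOTIONS
    ) / len(EMOTIONS)

def rank_by_emotion(items, query_emotions, alpha=0.5):
    """Blend keyword relevance with emotional fit; alpha weights the emotion term."""
    def score(item):
        return (1 - alpha) * item.relevance + alpha * emotion_similarity(query_emotions, item)
    return sorted(items, key=score, reverse=True)

if __name__ == "__main__":
    catalog = [
        MediaItem("Sunset timelapse", 0.7, {"joy": 0.9, "sadness": 0.1}),
        MediaItem("War documentary", 0.8, {"sadness": 0.8, "anger": 0.5}),
    ]
    # Detected state: the user currently appears sad.
    for item in rank_by_emotion(catalog, {"sadness": 1.0}):
        print(item.title)
```

The same function covers both interaction modes described above: the query vector can come from live emotion detection (adaptive results) or be set explicitly by the user (emotion as a search input).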
Emotion recognition technologies have seen significant advancements in recent years, particularly in their ability to analyze facial expressions, voice tones, and physiological signals. Among the leading tools in this domain is Affectiva's Emotion SDK, which offers robust facial emotion detection capabilities. Using a device's front-facing camera, Affectiva's SDK can detect and classify emotions such as joy, sadness, anger, and surprise in real time. In addition to Affectiva, the project will remain open to alternative solutions should they offer comparable or superior functionality during development.
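Whatever detector is chosen, raw per-frame scores tend to be noisy, so some temporal smoothing is useful before feeding them into retrieval. Below is a minimal sketch using an exponential moving average; the EmotionSmoother class and the score format are illustrative assumptions, not Affectiva's actual API.

```python
from collections import defaultdict

class EmotionSmoother:
    """Exponentially smooths noisy per-frame emotion scores so a single
    mislabeled frame does not flip the detected emotional state."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha               # higher alpha = reacts faster
        self.state = defaultdict(float)  # smoothed score per emotion

    def update(self, frame_scores: dict) -> str:
        """Feed one frame's raw scores; return the current dominant emotion."""
        for emotion, score in frame_scores.items():
            self.state[emotion] = (
                self.alpha * score + (1 - self.alpha) * self.state[emotion]
            )
        return max(self.state, key=self.state.get)

if __name__ == "__main__":
    smoother = EmotionSmoother()
    # frame_scores stands in for whatever the chosen SDK emits per frame;
    # the last frame is a spurious "surprise" spike that gets absorbed.
    frames = [
        {"joy": 0.8, "surprise": 0.1},
        {"joy": 0.8, "surprise": 0.1},
        {"joy": 0.8, "surprise": 0.1},
        {"joy": 0.2, "surprise": 0.9},
    ]
    for scores in frames:
        print(smoother.update(scores))  # stays "joy" throughout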
Start / End Dates
2025/01/27 - 2025/05/26