Mixed-Reality Multimedia Retrieval: Exploring User Interaction with Eye Tracking and Gesture Recognition (Bachelor Project, Finished)

Author

Rahel Kempf

Description

With the increasing use of mixed reality (MR) technologies, there is a growing interest in developing intuitive and immersive ways for users to interact with large datasets. One important area of research is how users can effectively explore multimedia data in a spatial environment. Current MR systems often rely on traditional input methods such as controllers or voice commands. More natural interfaces, such as eye tracking and gesture recognition, could significantly improve the user experience when interacting with multimedia objects.

This project investigates user interaction techniques for exploring multimedia content within a mixed-reality environment. The results of a predefined query are presented visually on a 3D virtual globe, allowing users to interact with the data through zooming, panning, and further exploration. The main objective is to study how users can best interact with these results using eye tracking and gesture-based controls, without the added challenge of formulating a query inside the mixed-reality environment. In doing so, the project seeks to identify and evaluate natural and efficient ways for users to navigate and explore large multimedia datasets.

Start / End Dates

2024/09/30 - 2024/11/25

Supervisors

Research Topics