Context-Aware Result Presentation for Multimedia Retrieval in Mixed Reality (Master Project, Ongoing)

Author

Lars Schneider

Description

Mixed Reality (MR) systems increasingly integrate computer vision and information retrieval to enrich the physical environment with dynamic, context-sensitive digital information. While prior work (MR.Intenso and (MR)2) focused on detecting objects in MR and retrieving related information, the challenge of meaningfully presenting such retrieval results within a spatial, interactive XR environment remains largely unsolved.

Conventional approaches often present results as flat, screen-like overlays, which can clutter the scene, obscure objects, or detach the information from its physical context. As XR devices such as the Apple Vision Pro become more common, new interaction and presentation paradigms are required to ensure that retrieved information is intuitive, spatially coherent, and non-intrusive.

This master project addresses this challenge by designing and evaluating novel approaches for embedding retrieval results directly into an XR scene. Instead of implementing an end-to-end retrieval pipeline, the work builds on existing applications (MR.Intenso and MediaMix) and focuses on exploring and developing new spatial UI/UX strategies for presenting results. Objects will still be detected in MR and textual queries supported, but the primary contribution lies in how the system presents the retrieved information in a context-aware, user-friendly manner.
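To give a flavour of what "embedding results into the scene" could mean in practice, the minimal sketch below shows one possible direction on Apple platforms using RealityKit: a retrieval result is attached as a world-anchored entity offset slightly above a detected object's position, rather than drawn as a flat screen overlay. This is an illustrative assumption, not code from MR.Intenso or MediaMix; the function name, offsets, and placeholder card are hypothetical.

```swift
import RealityKit
import UIKit

/// Sketch only (assumed API usage, not the project's implementation):
/// place a retrieval-result "card" as a world-anchored entity near a
/// detected object, instead of rendering it as a 2D screen overlay.
/// `objectPosition` is assumed to come from an upstream object detector.
func makeResultAnchor(near objectPosition: SIMD3<Float>) -> AnchorEntity {
    // Anchor fixed in world space at the detected object's location.
    let anchor = AnchorEntity(world: objectPosition)

    // Placeholder plane standing in for a result card; a real system would
    // render text, thumbnails, or interactive UI on or near this entity.
    let card = ModelEntity(
        mesh: .generatePlane(width: 0.25, height: 0.15),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )

    // Offset the card 20 cm above the object so it does not occlude it.
    card.position = SIMD3<Float>(0, 0.2, 0)
    anchor.addChild(card)
    return anchor
}
```

The project's actual presentation strategies (e.g. how results are laid out, grouped, or adapted to the surrounding context) are exactly what remains to be designed and evaluated.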

Start / End Dates

2026/02/16 - 2026/06/01

Supervisors

Research Topics