City-Stories: A Spatio-Temporal Mobile Multimedia Search System

Authors
Lukas Beck and Heiko Schuldt
Type
In Proceedings
Date
2016/12
Appears in
Proceedings of the 2016 IEEE International Symposium on Multimedia (ISM 2016)
Location
San Jose, CA, USA
Publisher
IEEE
Abstract

Mobile devices have become ubiquitous in recent years. With their powerful built-in cameras and sensors, they are used intensively for capturing multimedia content. At the same time, however, the support these devices offer for searching within multimedia collections is still quite limited. These limitations are particularly obvious in the search paradigms they usually support, which mainly consist of traditional keyword search and, at most, query-by-example content-based queries. In this paper, we present City-Stories, an application that helps users capture multimedia content and, in particular, provides a rich set of search paradigms. These include temporal search (when a photo was taken), spatial search (where a photo was taken), and content-based similarity search; for the latter, both query-by-example (QbE) and query-by-sketch (QbS) are available. All these search paradigms can be freely combined, either simultaneously (e.g., searching for spatial and content similarity at the same time) or sequentially (e.g., refining the results obtained with one query mode by means of another). We introduce the City-Stories system and present the results of an evaluation that shows the gain in effectiveness when combining different search paradigms on a mobile device, compared to using these paradigms individually.
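To give an idea of what mixing paradigms can look like, the following is a minimal, hypothetical sketch of combining a temporal filter, a spatial filter, and a content-similarity score when ranking photos. It is not the City-Stories implementation; all names, the descriptor, and the weighted scoring scheme are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from math import hypot

@dataclass
class Photo:
    path: str
    lat: float
    lon: float
    taken: datetime
    features: list[float]  # hypothetical global image descriptor

def content_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two feature vectors (stand-in for QbE/QbS scoring)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def combined_search(photos, query_features, center, radius_deg, start, end,
                    w_spatial=0.5, w_content=0.5):
    """Combine paradigms: filter by time and location, then rank by a weighted
    mix of spatial proximity and content similarity."""
    scored = []
    for p in photos:
        if not (start <= p.taken <= end):        # temporal filter
            continue
        dist = hypot(p.lat - center[0], p.lon - center[1])
        if dist > radius_deg:                    # spatial filter
            continue
        spatial_score = 1.0 - dist / radius_deg  # closer photos score higher
        content_score = content_similarity(p.features, query_features)
        scored.append((w_spatial * spatial_score + w_content * content_score, p))
    return [p for _, p in sorted(scored, key=lambda r: r[0], reverse=True)]
```

A sequential refinement, as described in the abstract, would simply run one filter first (e.g., the temporal one) and pass its result set to the next query mode.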