Image Retrieval at Memory’s Edge: Known Image Search based on User-Drawn Sketches
Michael Springmann, Ihab Al Kabary, Heiko Schuldt
Proceedings of the 19th ACM International Conference on Information and Knowledge Management (CIKM 2010)
Sheridan Printing, Alpha, New Jersey
With the ever-growing size of digital image collections, known image search is gaining more and more importance. Especially in collections where individual objects are not tagged with metadata describing their content, content-based image retrieval (CBIR) is a promising approach, but it usually suffers from the unavailability of query images that are good enough to express the user's information need. In this paper, we present a system that provides CBIR based on user-drawn sketches. The system combines angular radial partitioning, which extracts features from the user-provided sketch while taking into account the spatial distribution of edges, with the image distortion model. This combination offers several highly relevant invariances that allow the query sketch to deviate slightly from the searched image in terms of rotation, translation, relative size, and/or unknown objects in the background. To illustrate the benefits of the approach, we present search results from the evaluation of our system on the MIRFLICKR collection with 25,000 images and compare the retrieval results of pure metadata-driven approaches, pure content-based retrieval using different sketches, and combinations thereof.
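The core idea of angular radial partitioning can be illustrated with a short sketch: the edge map of a sketch (or image) is divided into sectors by angle and rings by radius around the image center, and the fraction of edge pixels falling into each cell forms the feature vector. The following is a minimal, hedged illustration of that binning scheme, not the paper's actual implementation; the function name, bin counts, and normalization are assumptions for demonstration.

```python
import numpy as np

def arp_features(edge_map, n_angular=8, n_radial=4):
    """Illustrative angular radial partitioning: histogram edge pixels
    into (angle, radius) cells around the image center.
    edge_map: 2-D binary array, nonzero where an edge pixel lies.
    Returns a flattened, L1-normalized histogram of length n_angular * n_radial.
    (Bin counts and normalization are illustrative choices, not from the paper.)
    """
    h, w = edge_map.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0          # image center
    ys, xs = np.nonzero(edge_map)                   # edge pixel coordinates
    dy, dx = ys - cy, xs - cx
    r = np.hypot(dx, dy)                            # radial distance per pixel
    r_max = np.hypot(cx, cy) + 1e-9                 # farthest possible distance
    theta = np.arctan2(dy, dx) % (2 * np.pi)        # angle in [0, 2*pi)
    # Map angle and radius to cell indices, clamping to the last bin.
    a_bin = np.minimum((theta / (2 * np.pi) * n_angular).astype(int), n_angular - 1)
    r_bin = np.minimum((r / r_max * n_radial).astype(int), n_radial - 1)
    hist = np.zeros((n_angular, n_radial))
    np.add.at(hist, (a_bin, r_bin), 1)              # count pixels per cell
    total = hist.sum()
    return (hist / total).ravel() if total else hist.ravel()
```

Because the descriptor records where edges fall relative to the center, shifting the partitioning origin or comparing histograms under cyclic shifts of the angular bins is what gives sketch-based matching some tolerance to translation and rotation.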