PAD-IR: Paper-Digital System for Information Capture and Retrieval (Finished)
The PAD-IR project will extend the notion of paper-digital retrieval systems beyond a paper-based interface to an information retrieval (IR) system, in order to truly bridge the paper-digital divide by allowing retrieval across different forms of media, including handwritten notes and sketches. It will thus not simply be a case of formulating queries on paper, but also of digitally capturing information on paper and linking it to various forms of digital media based on the semantics of that information and the context in which it was captured. Objects need to be managed together with their metadata, including links between objects, the context of their acquisition and content features. Retrieval may then be based on queries specified digitally or on paper, or even on some combination of both.
Since queries might encompass several media types as well as the additional object metadata, dedicated algorithms to search effectively in each of these media types have to be available as basic building blocks. Query processing will consist of the automatic composition of the necessary building blocks for each individual query, in a way that is completely transparent to the user (a minimal sketch of this idea follows below).
The application settings that PAD-IR will explore include various kinds of meeting scenarios as well as post-meeting retrieval of information. During meetings, users often work with several paper and digital documents, including handwritten notes taken by individual participants, sketches used as part of collaborative design processes, and presentation tools. Although there are existing tools to help record meeting sessions, they tend to focus either on digital recordings, such as a combination of audio, video and presentations, or solely on the recording of handwritten notes synchronised with audio recordings. Our goal is to allow participants in a meeting to work with a combination of paper documents and digital media, recording activities across all media in such a way that users can later retrieve information based on keyword search, similarity search (e.g. using handwritten sketches), timeline or association.
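To make the idea of query processing as an automatic, transparent composition of building blocks concrete, the following Python fragment is a purely illustrative sketch, not PAD-IR code; all names (Query, BuildingBlock, process, ...) are hypothetical. It only shows how dedicated search algorithms for keywords, sketches and timeline metadata could be selected and combined depending on what a query actually contains:

```python
# Illustrative sketch only (hypothetical names, not the PAD-IR implementation):
# per-media "building blocks" are picked and combined automatically per query.
from dataclasses import dataclass, field

@dataclass
class Query:
    keywords: list[str] = field(default_factory=list)
    sketch: bytes | None = None            # e.g. strokes captured by a digital pen
    time_range: tuple[str, str] | None = None   # ISO timestamps (start, end)

class BuildingBlock:
    """One dedicated search algorithm for a single media type or metadata facet."""
    def applies_to(self, q: Query) -> bool: ...
    def score(self, q: Query, obj: dict) -> float: ...

class KeywordBlock(BuildingBlock):
    def applies_to(self, q): return bool(q.keywords)
    def score(self, q, obj):
        text = obj.get("text", "").lower()
        return float(sum(text.count(k.lower()) for k in q.keywords))

class SketchBlock(BuildingBlock):
    def applies_to(self, q): return q.sketch is not None
    def score(self, q, obj):
        # placeholder for a real sketch-similarity measure (e.g. edge-based matching)
        return 1.0 if obj.get("kind") == "sketch" else 0.0

class TimelineBlock(BuildingBlock):
    def applies_to(self, q): return q.time_range is not None
    def score(self, q, obj):
        start, end = q.time_range
        return 1.0 if start <= obj.get("captured_at", "") <= end else 0.0

def process(q: Query, objects: list[dict]) -> list[dict]:
    """Compose the applicable blocks automatically; the user never selects them."""
    blocks = [b for b in (KeywordBlock(), SketchBlock(), TimelineBlock()) if b.applies_to(q)]
    return sorted(objects, key=lambda o: sum(b.score(q, o) for b in blocks), reverse=True)
```

In this sketch, a query combining keywords with a pen-drawn sketch would automatically activate both the keyword and the sketch blocks, while a purely digital keyword query would use only the first; the user never has to choose a search algorithm, which is the transparency the project description refers to.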
Start / End Dates
01.10.2009 - 30.09.2011
Partners
Prof. Moira Norrie, ETH Zürich (globis Group Website)
Funding Agencies
Swiss National Science Foundation (SNF)
Staff
Research Topics
Publications
2013
- Ihab Al Kabary, Marcel Büchler, Heiko Schuldt
TOUCHify: Bringing Pen-Based Touch Screen Functionality to Flat Panel Display Screens
Proceedings of the International Conference on Information Society (i-Society 2013), Toronto, Canada, 2013/6
- Fabrice Matulic, Moira C. Norrie, Ihab Al Kabary, Heiko Schuldt
Gesture-Supported Document Creation on Pen and Touch Tabletops
Proceedings of the 31st ACM Conference on Human Factors in Computing Systems (CHI 2013), Paris, France, 2013/4
2012
- Ihab Al Kabary, Heiko Schuldt
SKETCHify - an Adaptive Prominent Edge Detection Algorithm for Optimized Query-by-Sketch Image Retrieval
Proceedings of the 10th International Workshop on Adaptive Multimedia Retrieval (AMR'12), Copenhagen, Denmark, 2012/10
- Ihab Al Kabary, Heiko Schuldt
Sketch-based Image Similarity Search with a Pen and Paper Interface
Proceedings of the 2012 International Conference on Research and Development in Information Retrieval, Portland, OR, USA, 2012/8
- Ivan Giangreco, Michael Springmann, Ihab Al Kabary, Heiko Schuldt
A User Interface for Query-by-Sketch based Image Retrieval with Color Sketches
Proceedings of the European Conference on Information Retrieval (ECIR), Barcelona, Spain, 2012/4
- Roman Kreuzer, Michael Springmann, Ihab Al Kabary, Heiko Schuldt
An Interactive Paper and Digital Pen Interface for Query-by-Sketch Image Retrieval
Proceedings of the 34th European Conference on Information Retrieval, Barcelona, Spain, 2012/4
- Michael Springmann
Building Blocks for Adaptable Image Search in Digital Libraries
PhD Thesis, Department of Mathematics and Computer Science, University of Basel, Switzerland, 2012/4
2010
- Michael Springmann, Ihab Al Kabary, Heiko Schuldt
Image Retrieval at Memory's Edge: Known Image Search based on User-Drawn Sketches
Proceedings of the 19th ACM International Conference on Information and Knowledge Management (CIKM 2010), Toronto, Canada, 2010/10
- Michael Springmann, Ihab Al Kabary, Heiko Schuldt
Experiences with QbS: Challenges and Evaluation of Known Image Search based on User-Drawn Sketches
Technical Report CS-2010-001, Department of Computer Science, 2010/8
- Michael Springmann, Dietmar Kopp, Heiko Schuldt
QbS - Searching for Known Images using User-Drawn Sketches
Proceedings of the 11th ACM SIGMM International Conference on Multimedia Information Retrieval (MIR 2010), Philadelphia, Pennsylvania, 2010/3