Digital libraries, interactive paper, semantic annotation, content-based image retrieval (CBIR), image similarity search, information retrieval, human-computer interaction, audio retrieval, digital pen, content-based multimedia retrieval
Springmann Michael, Al Kabary Ihab, Schuldt Heiko (2010), Experiences with QbS: Challenges and Evaluation of Known Image Search based on User-Drawn Sketches, Technical Report, University of Basel, Basel.
Springmann Michael, Al Kabary Ihab, Schuldt Heiko (2010), Image Retrieval at Memory's Edge: Known Image Search based on User-Drawn Sketches, in Proceedings of the 19th ACM International Conference on Information and Knowledge Management, Toronto, Canada. ACM Press, New York, NY, USA.
Springmann Michael, Kopp Dietmar, Schuldt Heiko (2010), QbS - Searching for Known Images using User-Drawn Sketches, in Proceedings of the 11th ACM SIGMM International Conference on Multimedia Information Retrieval, Philadelphia, PA, USA. ACM Press, New York, NY, USA.
Ispas Adriana, Signer Beat, Norrie Moira C. (2011), An extensible digital ink segmentation and classification framework for natural notetaking, in Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems, Pisa, Italy. ACM Press, New York, NY, USA.
Ispas Adriana, Li N., Norrie Moira, Signer Beat (2010), Paper-Digital Meeting Support and Review, in 6th International Conference on Collaborative Computing: Networking, Applications and Worksharing, Chicago, IL, USA. IEEE Xplore, New York, NY, USA.
Ispas Adriana, Signer Beat, Norrie Moira (2010), A Study of Incidental Notetaking to Inform Digital Pen and Paper Solutions, in Proceedings of the 2010 British Computer Society Conference on Human-Computer Interaction, Dundee, United Kingdom. ACM Press, New York, NY, USA.
Signer Beat, Norrie Moira (2010), Interactive Paper: Past, Present and Future, in Ubicomp, Copenhagen, Denmark. ACM Press, New York, NY, USA.
Signer Beat, Norrie Moira (2011), A Model and Architecture for Open Cross-Media Annotation and Link Services, in Information Systems Journal, 36(3), 538-550.
The project is a continuation of SNF Project 117800 QbS: Query-by-Sketching, in which we investigated a paper-based interface to an image retrieval engine. In the context of the QbS project, we explored ways in which new digital pen and paper technologies can be used to formulate and/or refine queries on paper by means of sketching and the user-defined selection of regions of interest, both combined with handwritten text and gestures.
In this continuation, we want to extend our notion of paper-digital retrieval systems beyond that of a paper-based interface to an information retrieval (IR) system, so that we can truly bridge the paper-digital divide by allowing retrieval across different forms of media, including handwritten notes and sketches. It will thus not simply be a matter of formulating queries on paper, but also of digitally capturing information on paper and linking it to various forms of digital media based on the semantics of that information and the context in which it was captured. Objects need to be managed together with their metadata, including links between objects, the context of their acquisition and content features. Retrieval may then be based on queries specified digitally, on paper, or on some combination of both. Since IR queries might encompass several media types as well as the additional object metadata, dedicated algorithms that effectively search each of these media types have to be available as basic building blocks. Query processing will then consist of automatically composing the necessary building blocks for each individual query, in a way that is completely transparent to the user.
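The building-block idea can be pictured as follows: each media type gets a dedicated retrieval component, and a simple planner selects and composes only the components a given query needs. The sketch below is purely illustrative; all class names, the query representation and the score-fusion strategy (summation) are our assumptions, not the project's actual design.

```python
# Illustrative sketch: per-media retrieval components composed into a query
# plan. Names and the fusion strategy are assumptions for illustration only.

from dataclasses import dataclass, field


@dataclass
class QueryPlan:
    blocks: list = field(default_factory=list)

    def run(self, query):
        # Each block scores documents for the part of the query it
        # understands; per-block scores are merged by summation (one
        # possible fusion strategy among many).
        scores = {}
        for block in self.blocks:
            for doc_id, score in block.search(query).items():
                scores[doc_id] = scores.get(doc_id, 0.0) + score
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


class KeywordBlock:
    def __init__(self, index):
        self.index = index  # doc_id -> set of indexed terms

    def search(self, query):
        terms = set(query.get("keywords", []))
        return {d: len(terms & t) for d, t in self.index.items() if terms & t}


class SketchBlock:
    def __init__(self, features):
        self.features = features  # doc_id -> precomputed feature vector

    def search(self, query):
        sketch = query.get("sketch")
        if sketch is None:
            return {}
        # Toy similarity: inverse L1 distance between feature vectors.
        return {d: 1.0 / (1.0 + sum(abs(a - b) for a, b in zip(sketch, f)))
                for d, f in self.features.items()}


def compose(query, available_blocks):
    # The planner picks only the blocks relevant to this query,
    # transparently to the user.
    plan = QueryPlan()
    for key, block in available_blocks.items():
        if key in query:
            plan.blocks.append(block)
    return plan
```

A query containing only keywords would then be answered by the keyword block alone, while a query that also carries a sketch would transparently involve both blocks.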
The application settings that we will explore include various kinds of meeting scenarios as well as post-meeting retrieval of information. During meetings, users often work with several paper and digital documents, including handwritten notes taken by individual participants, sketches produced as part of collaborative design processes, and presentation tools. Although tools exist to help record meeting sessions, they tend to focus either on digital recordings, such as a combination of audio, video and presentations, or solely on the recording of handwritten notes synchronised with audio recordings. Our goal is to allow participants in a meeting to work with a combination of paper documents and digital media, recording activities across all media in such a way that users can later retrieve information based on keyword search, similarity search (e.g., using handwritten sketches), timeline or association.
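These retrieval modes can be thought of as filters over captured meeting items. The toy model below shows keyword, timeline and association lookups over a flat record per captured item; the field names and the link representation are illustrative assumptions only (similarity search would plug in via content features, as in the QbS work).

```python
# Toy model of captured meeting items and three of the retrieval modes
# mentioned above. Field names and link structure are assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class CapturedItem:
    item_id: str
    media: str                  # e.g. "note", "audio", "slide"
    keywords: List[str]
    timestamp: float            # seconds since the start of the meeting
    linked_to: List[str] = field(default_factory=list)  # associations


def by_keyword(items, term):
    return [i for i in items if term in i.keywords]


def by_timeline(items, start, end):
    return [i for i in items if start <= i.timestamp <= end]


def by_association(items, item_id):
    # Follow explicit links in either direction.
    return [i for i in items
            if item_id in i.linked_to
            or any(j.item_id == item_id and i.item_id in j.linked_to
                   for j in items)]
```

For example, a handwritten note linked to an audio snippet recorded at the same moment could later be found either by one of its keywords, by the time window in which it was taken, or by following the association from the audio snippet.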
To achieve this goal, the project consists of four main parts. First, we will investigate how existing tools for capturing and interacting with paper documents can best be combined with the kinds of functionality currently supported by meeting-recording tools. Second, we will analyse which basic building blocks are needed for search and retrieval in these digital collections and provide new building blocks where necessary. Third, we will investigate ways in which these building blocks can be automatically combined to implement user-defined queries in a general platform for cross-media information retrieval. Fourth, we will experiment with ways of aggregating and retrieving information captured and accessed during meetings, in order to provide users with a rich set of tools to support their activities during and after meetings.