Project


PAD-IR: Paper-Digital System for Information Capture and Retrieval

English title PAD-IR: Paper-Digital System for Information Capture and Retrieval
Applicant Schuldt Heiko
Number 126829
Funding scheme Project funding (Div. I-III)
Research institution Fachbereich Informatik, Departement Mathematik und Informatik, Universität Basel
Institution of higher education Universität Basel – BS
Main discipline Computer Science
Start/End 01.10.2009 - 30.09.2011
Approved amount 222'942.00

Keywords (10)

Digital Libraries; Interactive paper; Semantic annotation; Content-based Image Retrieval (CBIR); Image Similarity Search; Information Retrieval; Human-Computer Interaction; Audio Retrieval; digital pen; content-based multimedia retrieval

Lay Summary (English)

The PAD-IR project will extend the notion of paper-digital retrieval systems beyond that of a paper-based interface to an information retrieval (IR) system in order to truly bridge the paper-digital divide by allowing retrieval across different forms of media, including handwritten notes and sketches. Thus, it will not simply be a case of formulating queries on paper, but also of digitally capturing information on paper and linking it to various forms of digital media based on the semantics of that information and the context in which it was captured. Objects need to be managed together with their metadata, including links between objects, the context of their acquisition and content features. Retrieval may then be based on queries specified digitally or on paper, or on a combination of both. Since queries might encompass several media types as well as the additional object metadata, dedicated algorithms to effectively search these media types have to be available as basic building blocks. Query processing will consist of the automatic, individual composition of the necessary building blocks, in a way that is completely transparent to the user.

The application settings that PAD-IR will explore include various kinds of meeting scenarios as well as post-meeting retrieval of information. During meetings, users often work with several paper and digital documents, including handwritten notes taken by individual participants, sketches used as part of collaborative design processes, and presentation tools. Although there are existing tools to help record meeting sessions, they tend to focus either on digital recordings such as a combination of audio, video and presentations, or solely on the recording of handwritten notes synchronised with audio recordings. Our goal is to allow participants in a meeting to work with a combination of paper documents and digital media, recording activities across all media in such a way that users can later retrieve information based on keyword search, similarity search (e.g. by using handwritten sketches), timeline or association.
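The summary above centres on managing captured objects together with their metadata, acquisition context, content features and links between objects, and on retrieval by association. As an illustration only, the following Python sketch shows one possible minimal data model for such cross-media objects; all class and field names are hypothetical assumptions and are not taken from the PAD-IR implementation.

```python
# Hypothetical sketch of a cross-media object store as described above:
# objects carry metadata, acquisition context, content features and links.
# Names and structure are illustrative assumptions, not the PAD-IR code base.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List


@dataclass
class CapturedObject:
    """A captured media object (handwritten note, sketch, audio clip, slide)."""
    object_id: str
    media_type: str                                           # e.g. "ink", "audio", "image"
    metadata: Dict[str, str] = field(default_factory=dict)
    context: Dict[str, str] = field(default_factory=dict)     # e.g. meeting id, author
    features: Dict[str, list] = field(default_factory=dict)   # content features per extractor
    captured_at: datetime = field(default_factory=datetime.utcnow)


@dataclass
class Link:
    """A typed, directed link between two captured objects."""
    source_id: str
    target_id: str
    link_type: str                                            # e.g. "annotates", "recorded-during"


class ObjectStore:
    """In-memory store keeping objects together with their links."""

    def __init__(self) -> None:
        self.objects: Dict[str, CapturedObject] = {}
        self.links: List[Link] = []

    def add(self, obj: CapturedObject) -> None:
        self.objects[obj.object_id] = obj

    def link(self, source_id: str, target_id: str, link_type: str) -> None:
        self.links.append(Link(source_id, target_id, link_type))

    def associated(self, object_id: str) -> List[CapturedObject]:
        """Retrieval by association: all objects linked to the given one."""
        ids = {l.target_id for l in self.links if l.source_id == object_id} | \
              {l.source_id for l in self.links if l.target_id == object_id}
        return [self.objects[i] for i in ids if i in self.objects]
```

In such a sketch, a handwritten note captured during a meeting could be linked to the audio segment recorded at the same time, so that retrieving either object later also surfaces the other.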
Last update: 21.02.2013

Responsible applicant and co-applicants

Employees

Publications

Publication
A Model and Architecture for Open Cross-Media Annotation and Link Services
Signer Beat, Norrie Moira (2011), A Model and Architecture for Open Cross-Media Annotation and Link Services, in Information Systems Journal, 36(3), 538-550.
An extensible digital ink segmentation and classification framework for natural notetaking
Ispas Adriana, Signer Beat, Norrie Moira C. (2011), An extensible digital ink segmentation and classification framework for natural notetaking, in Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems, Pisa, Italy, ACM Press, New York, NY, USA.
A Study of Incidental Notetaking to Inform Digital Pen and Paper Solutions
Ispas Adriana, Signer Beat, Norrie Moira (2010), A Study of Incidental Notetaking to Inform Digital Pen and Paper Solutions, in Proceedings of the 2010 British Computer Society Conference on Human-Computer Interaction, Dundee, United Kingdom, ACM Press, New York, NY, USA.
Experiences with QbS: Challenges and Evaluation of Known Image Search based on User-Drawn Sketches (Technical Report)
Springmann Michael, Al Kabary Ihab, Schuldt Heiko (2010), Experiences with QbS: Challenges and Evaluation of Known Image Search based on User-Drawn Sketches (Technical Report), University of Basel, Basel.
Image Retrieval at Memory's Edge: Known Image Search based on User-Drawn Sketches
Springmann Michael, Al Kabary Ihab, Schuldt Heiko (2010), Image Retrieval at Memory's Edge: Known Image Search based on User-Drawn Sketches, in Proceedings of the 19th ACM International Conference on Information and Knowledge Management, Toronto, Canada, ACM Press, New York, NY, USA.
Interactive Paper: Past, Present and Future
Signer Beat, Norrie Moira (2010), Interactive Paper: Past, Present and Future, in UbiComp, Copenhagen, Denmark, ACM Press, New York, NY, USA.
Paper-Digital Meeting Support and Review
Ispas Adriana, Li N., Norrie Moira, Signer Beat (2010), Paper-Digital Meeting Support and Review, in 6th International Conference on Collaborative Computing: Networking, Applications and Worksharing, Chicago, IL, USA, IEEE Xplore, New York, NY, USA.
QbS - Searching for Known Images using User-Drawn Sketches
Springmann Michael, Kopp Dietmar, Schuldt Heiko (2010), QbS - Searching for Known Images using User-Drawn Sketches, in Proceedings of the 11th ACM SIGMM International Conference on Multimedia Information Retrieval, Philadelphia, PA, USA, ACM Press, New York, NY, USA.

Scientific events

Active contribution

Title | Type of contribution | Title of article or contribution | Date | Place | Persons involved
19th ACM International Conference on Information and Knowledge Management (CIKM 2010) | Talk given at a conference | Image Retrieval at Memory's Edge: Known Image Search based on User-Drawn Sketches | 26.10.2010 | Toronto, Canada | Al Kabary Ihab
11th ACM SIG Multimedia International Conference on Multimedia Information Retrieval (MIR 2010) | Talk given at a conference | QbS - Searching for Known Images using User-Drawn Sketches | 26.03.2010 | Philadelphia, PA, USA | Schuldt Heiko


Self-organised

Title | Date | Place

Knowledge transfer events



Self-organised

Title | Date | Place
Lecture Multimedia Retrieval (CS342) | 14.02.2011 | Basel, Switzerland

Communication with the public

Communication | Title | Media | Place | Year
Talks/events/exhibitions | Project presentation at the student information day of the University of Basel | | German-speaking Switzerland | 2011
Talks/events/exhibitions | Project presentation at the 550th anniversary celebration of the University of Basel - main event, Basel | | German-speaking Switzerland | 2010
Talks/events/exhibitions | Project presentation at the 550th anniversary celebration of the University of Basel in Aarau | | German-speaking Switzerland | 2010
Talks/events/exhibitions | Project presentation at the 550th anniversary celebration of the University of Basel in Liestal | | German-speaking Switzerland | 2010
Talks/events/exhibitions | Project presentation at the 550th anniversary celebration of the University of Basel in Solothurn | | German-speaking Switzerland | 2010

Associated projects

Number | Title | Start | Funding scheme
137944 | MM-DocTable: Multimedia Document Engineering Workflows on Tabletop Devices | 01.10.2011 | Project funding (Div. I-III)
117800 | Query by Sketching (QbS) | 01.10.2007 | Project funding (Div. I-III)

Abstract

The project is a continuation of SNF Project 117800 QbS: Query-by-Sketching, in which we investigated a paper-based interface to an image retrieval engine. In the context of the QbS project, we explored ways in which new digital pen and paper technologies could be used to formulate and/or refine queries on paper by means of sketching and the user-defined selection of regions of interest, both combined with handwritten texts and gestures.

In this continuation, we want to extend our notion of paper-digital retrieval systems beyond that of a paper-based interface to an information retrieval (IR) system in order to truly bridge the paper-digital divide by allowing retrieval across different forms of media, including handwritten notes and sketches. Thus, it will not simply be a case of formulating queries on paper, but also of digitally capturing information on paper and linking it to various forms of digital media based on the semantics of that information and the context in which it was captured. Objects need to be managed together with their metadata, including links between objects, the context of their acquisition and content features. Retrieval may then be based on queries specified digitally or on paper, or on a combination of both. Since IR queries might encompass several media types as well as the additional object metadata, dedicated algorithms to effectively search these media types have to be available as basic building blocks. Query processing will consist of the automatic, individual composition of the necessary building blocks, in a way that is completely transparent to the user.

The application settings that we will explore include various kinds of meeting scenarios as well as post-meeting retrieval of information. During meetings, users often work with several paper and digital documents, including handwritten notes taken by individual participants, sketches used as part of collaborative design processes, and presentation tools. Although there are existing tools to help record meeting sessions, they tend to focus either on digital recordings such as a combination of audio, video and presentations, or solely on the recording of handwritten notes synchronised with audio recordings. Our goal is to allow participants in a meeting to work with a combination of paper documents and digital media, recording activities across all media in such a way that users can later retrieve information based on keyword search, similarity search (e.g., by using handwritten sketches), timeline or association.

To achieve this goal, the project will consist of four main parts. First, we will investigate how existing tools for capture and interaction with paper documents can best be combined with the kinds of functionality currently supported by tools for recording meetings. Second, we will analyze which basic building blocks for search and retrieval in these digital collections will be needed and provide new building blocks whenever necessary. Third, we will investigate ways in which these building blocks can be automatically combined in order to implement user-defined queries in a general platform for cross-media information retrieval. Fourth, we will experiment with ways of aggregating and retrieving information captured and accessed during meetings in order to provide users with a rich set of tools to support their activities during and after meetings.
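The abstract describes query processing as an automatic composition of media-specific building blocks that is transparent to the user. The following Python sketch illustrates one way such a composition could look; the building-block interface, the registry keyed by media type and the simple additive score fusion are illustrative assumptions, not the architecture actually built in PAD-IR.

```python
# Hypothetical sketch of query processing as a composition of media-specific
# building blocks. Interface names, the per-media-type registry and the
# linear score fusion are assumptions made for illustration only.
from typing import Dict, List, Protocol, Tuple


class RetrievalBlock(Protocol):
    """Basic building block: searches one media type, returns (object_id, score) pairs."""
    media_type: str

    def search(self, query_part: dict) -> List[Tuple[str, float]]: ...


class QueryProcessor:
    """Composes the building blocks required for a query, transparently to the user."""

    def __init__(self, blocks: List[RetrievalBlock]) -> None:
        self.blocks = {b.media_type: b for b in blocks}

    def process(self, query: Dict[str, dict], top_k: int = 10) -> List[Tuple[str, float]]:
        # Select only the blocks needed for the media types present in the query ...
        needed = [(m, self.blocks[m]) for m in query if m in self.blocks]
        # ... run them individually and fuse their scores (here: a simple sum).
        fused: Dict[str, float] = {}
        for media_type, block in needed:
            for object_id, score in block.search(query[media_type]):
                fused[object_id] = fused.get(object_id, 0.0) + score
        return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
```

A query combining, say, a handwritten sketch with a few keywords would be passed as a dictionary with one entry per media type, and only the sketch-based and text-based blocks would be invoked; the user never selects the blocks explicitly.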