
Intention-Aware Gaze-Based Assistance on Maps

English title: Intention-Aware Gaze-Based Assistance on Maps
Applicant: Kiefer Peter
Number: 162886
Funding scheme: Project funding (Div. I-III)
Research institution: Institut für Kartografie und Geoinformation, ETH Zürich, Departement Bau, Umwelt und Geomatik
Institution of higher education: ETH Zurich - ETHZ
Main discipline: Information Technology
Start/End: 01.02.2016 - 31.01.2019
Approved amount: 179'308.00

Keywords (7)

Map Adaptation; Eye Tracking; Gaze-Based Assistance; Intention Recognition; Gaze-Based Interaction; Activity Recognition; Context

Lay Summary (translated from German)

Lead
Adaptive human-computer interfaces can significantly simplify interaction with digital maps, for example during navigation, during information search, or for users with physical impairments. A map system that quickly and accurately recognizes its users' information needs could show or hide certain map contents and thus increase the effectiveness and efficiency of the interaction. Since maps are perceived and explored largely visually, it stands to reason that gaze behavior - measured with camera systems known as eye trackers - can be used to recognize intentions and activities, and thus serve as a trigger for map adaptation.
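As a purely hypothetical illustration of the kind of adaptation described above (not part of the project), a map client could count which feature category the user's recent fixations fall on and emphasize the corresponding layer once a share threshold is exceeded. The categories, threshold, and function name below are invented for this sketch.

```python
# Hypothetical sketch of gaze-triggered map adaptation (not the project's system):
# if the user's recent fixations concentrate on one feature category, that
# category's layer is suggested for emphasis. Categories and the threshold
# are invented for illustration.
from collections import Counter

def suggest_layer_to_emphasize(fixated_categories, min_share=0.6):
    """fixated_categories: categories of map features hit by recent fixations,
    e.g. ["restaurant", "road", "restaurant", ...]. Returns a category or None."""
    if not fixated_categories:
        return None
    category, count = Counter(fixated_categories).most_common(1)[0]
    return category if count / len(fixated_categories) >= min_share else None

print(suggest_layer_to_emphasize(["restaurant", "restaurant", "road", "restaurant"]))
# -> 'restaurant'  (the map could then highlight the restaurant layer)
```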

Goals of the research project

The main goal of the research project is to investigate methods for recognizing activities and intentions from visual behavior during interaction with maps. The following questions will be addressed:

1. Can a map user's activity or intention, and thus their information need, be inferred from their gaze behavior (e.g., route planning, searching for a restaurant)?

2. How should a helpful map adaptation be designed?

3. How should general user acceptance of gaze-based intention recognition on maps be assessed?

Scientific and societal context of the research project

The project will develop new computational approaches for recognizing intentions from gaze data during map use. These results will influence future standards in human-computer interaction and extend the methodological toolbox of artificial intelligence for building assistance systems. The insights gained in the project about people's behavior and intentions when using digital maps will provide a valuable basis for further research in spatial cognition.


Last update: 22.10.2015

Responsible applicant and co-applicants

Employees


Publications

FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps
Göbel Fabian, Kiefer Peter, Raubal Martin (2019), FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps, in GeoInformatica, 1.
A public gaze-controlled campus map
Göbel Fabian, Bakogioannis Nikolaos, Henggeler Katharina, Tschümperlin Roswita, Xu Yang, Kiefer Peter, Raubal Martin (2018), A public gaze-controlled campus map, in ET4S Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop, Zürich, Switzerland, ETH Zürich, Zürich, Switzerland.
Challenges in gaze-based intention recognition
Kiefer Peter (2018), Challenges in gaze-based intention recognition, in Duchowski Andrew, Chuang Lewis, Weiskopf Daniel, Qvarfordt Pernilla (ed.), Dagstuhl Publishing, Dagstuhl, Germany, 44.
Detecting Mindless Gaze
Raubal Martin (2018), Detecting Mindless Gaze, in Chuang Lewis, Duchowski Andrew, Weiskopf Daniel, Qvarfordt Pernilla (ed.), Dagstuhl Publishing, Dagstuhl, Germany, 143.
ET4S Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop
Kiefer Peter, Giannopoulos Ioannis, Raubal Martin, Göbel Fabian, Duchowski Andrew T. (ed.) (2018), ET4S Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop, ETH Zürich, Zürich, Switzerland.
Gaze Sequences and Map Task Complexity
Göbel Fabian, Kiefer Peter, Giannopoulos Ioannis, Raubal Martin (2018), Gaze Sequences and Map Task Complexity, in 10th International Conference on Geographic Information Science (GIScience 2018), Melbourne, Australia, Schloss Dagstuhl - Leibniz-Zentrum für Informatik, Dagstuhl, Germany.
Improving map reading with gaze-adaptive legends
Göbel Fabian, Kiefer Peter, Giannopoulos Ioannis, Duchowski Andrew T., Raubal Martin (2018), Improving map reading with gaze-adaptive legends, in the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), Warsaw, Poland, ACM, New York.
The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation
Duchowski Andrew T., Krejtz Krzysztof, Krejtz Izabela, Biele Cezary, Niedzielska Anna, Kiefer Peter, Raubal Martin, Giannopoulos Ioannis (2018), The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation, in the 2018 CHI Conference on Human Factors in Computing Systems (CHI 2018), Montréal, QC, Canada, ACM, New York.
Unsupervised Clustering of Eye Tracking Data
Göbel Fabian, Martin Henry (2018), Unsupervised Clustering of Eye Tracking Data, in Spatial Big Data and Machine Learning in GIScience, Workshop at GIScience 2018, Melbourne, Australia, ETH Zürich, Zürich, Switzerland.
Controllability matters: The user experience of adaptive maps
Kiefer Peter, Giannopoulos Ioannis, Anagnostopoulos Vasileios Athanasios, Schöning Johannes, Raubal Martin (2017), Controllability matters: The user experience of adaptive maps, in GeoInformatica, 21(3), 619-641.
Eye tracking for spatial research: Cognition, computation, challenges
Kiefer Peter, Giannopoulos Ioannis, Raubal Martin, Duchowski Andrew (2017), Eye tracking for spatial research: Cognition, computation, challenges, in Spatial Cognition & Computation, 17(1-2), 1-19.
FeaturEyeTrack: A vector tile-based eye tracking framework for interactive maps
Göbel Fabian, Kiefer Peter, Raubal Martin (2017), FeaturEyeTrack: A vector tile-based eye tracking framework for interactive maps, in Short papers, posters and poster abstracts of the 20th AGILE Conference on Geographic Information Science, Wageningen University & Research, Wageningen, The Netherlands.
Measuring cognitive load for map tasks through pupil diameter
Kiefer Peter, Giannopoulos Ioannis, Duchowski Andrew, Raubal Martin (2016), Measuring cognitive load for map tasks through pupil diameter, in Geographic Information Science. GIScience 2016, Montréal, Canada, Springer International Publishing, Cham.
The importance of visual attention for adaptive interfaces
Göbel Fabian, Giannopoulos Ioannis, Raubal Martin (2016), The importance of visual attention for adaptive interfaces, in the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2016), Florence, Italy, ACM, New York.
Look There! Be Social and Share
Göbel Fabian, Kwok Tiffany C.K., Rudi David, Look There! Be Social and Share, in Challenges Using Head-Mounted Displays in Shared and Social Spaces, Workshop at CHI 2019, Glasgow, U.K., ETH Zürich, Zürich, Switzerland.
POI-Track: Improving Map-Based Planning with Implicit POI Tracking
Göbel Fabian, Kiefer Peter, POI-Track: Improving Map-Based Planning with Implicit POI Tracking, in Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA'19), Denver, Colorado, USA, ACM, New York.

Collaboration

Group / person Country
Types of collaboration
Dr. Ionut Iosifescu, Swiss Federal Institute for Forest, Snow and Landscape Research WSL, Switzerland (Europe)
- in-depth/constructive exchanges on approaches, methods or results
Prof. Dr. Johannes Schöning, Human-computer interaction, University of Bremen, Germany (Europe)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
Prof. Dr. Andreas Bulling, Human-Computer Interaction and Cognitive Systems, University of Stuttgart, Germany (Europe)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
Prof. Dr. Andrew Duchowski, Clemson University, Clemson, SC, United States of America (North America)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
- Exchange of personnel

Scientific events

Active participation

Title | Type of contribution | Title of article or contribution | Date | Place | Persons involved
The 10th International Conference on Geographic Information Science (GIScience 2018) | Talk given at a conference | Gaze Sequences and Map Task Complexity | 28.08.2018 | Melbourne, Australia | Raubal Martin; Göbel Fabian; Kiefer Peter
Spatial Big Data and Machine Learning in GIScience | Talk given at a conference | Unsupervised Clustering of Eye Tracking Data | 28.08.2018 | Melbourne, Australia | Göbel Fabian
Colloquium Cognitive Systems, Ulm University | Individual talk | Gaze-based assistance for spatial decision making | 28.06.2018 | Ulm, Germany | Kiefer Peter
ETVIS 2018 Third Workshop on Eye Tracking and Visualization | Talk given at a conference | Panel discussion: "How can visualization make a larger contribution to ETRA?" | 16.06.2018 | Warsaw, Poland | Kiefer Peter
2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018) | Talk given at a conference | Improving Map Reading with Gaze-Adaptive Legends | 14.06.2018 | Warsaw, Poland | Kiefer Peter; Raubal Martin; Göbel Fabian
ACM CHI Conference on Human Factors in Computing Systems (CHI 2018) | Talk given at a conference | The index of pupillary activity: Measuring cognitive load vis-à-vis task difficulty with pupil oscillation | 24.04.2018 | Montréal, Canada | Raubal Martin; Kiefer Peter
3rd International Workshop on Eye Tracking for Spatial Research (ET4S 2018) | Poster | A Public Gaze-Controlled Campus Map | 14.01.2018 | Zürich, Switzerland | Göbel Fabian
20th AGILE conference on Geographic Information Science | Talk given at a conference | FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps | 09.05.2017 | Wageningen, Netherlands | Kiefer Peter; Raubal Martin; Göbel Fabian
The 9th International Conference on Geographic Information Science (GIScience 2016) | Talk given at a conference | Measuring Cognitive Load for Map Tasks through Pupil Diameter | 28.09.2016 | Montréal, Canada | Kiefer Peter; Raubal Martin
Smarttention, please!: 2nd workshop on intelligent attention management on mobile devices, Mobile HCI 2016 | Talk given at a conference | The Importance of Visual Attention for Adaptive Interfaces | 06.09.2016 | Florence, Italy | Raubal Martin; Göbel Fabian
GI-Forum Universität Münster | Individual talk | Gaze-Based Spatial Interaction | 03.06.2016 | Münster, Germany | Raubal Martin; Kiefer Peter


Self-organised

Title | Date | Place
Eye Tracking Interest Group Zurich: Focus Meeting on "Cognitive Load" | 23.03.2016 | Zürich, Switzerland

Knowledge transfer events

Active participation

Title | Type of contribution | Date | Place | Persons involved
GeoSummit 2018 | Performances, exhibitions (e.g. for education institutions) | 05.06.2018 | Bern, Switzerland | Kiefer Peter; Göbel Fabian
GeoSummit 2016 | Performances, exhibitions (e.g. for education institutions) | 08.06.2016 | Bern, Switzerland | Göbel Fabian


Communication with the public

Communication | Title | Media | Place | Year
Talks/events/exhibitions | Visuelle Aufmerksamkeit und adaptive Karten [Visual attention and adaptive maps] (FHNW 2018) | | German-speaking Switzerland | 2018
Talks/events/exhibitions | Was Blicke verraten - Eye Tracking [What gazes reveal - eye tracking] (GeoSchoolDay, GeoSummit 2018) | | International | 2018
Talks/events/exhibitions | Was Blicke verraten - Eye Tracking [What gazes reveal - eye tracking] (Treffpunkt Science City) | | German-speaking Switzerland | 2018
Talks/events/exhibitions | Visuelle Aufmerksamkeit und adaptive Karten [Visual attention and adaptive maps] (FHNW 2017) | | German-speaking Switzerland | 2017
Talks/events/exhibitions | Blickgesteuertes Geo-Spiel [Gaze-controlled geo-game] (GeoSchoolDay, GeoSummit 2016) | | International | 2016
Talks/events/exhibitions | Visuelle Aufmerksamkeit und adaptive Karten [Visual attention and adaptive maps] (FHNW 2016) | | German-speaking Switzerland | 2016

Abstract

The concept of “one map fits all users” is obsolete. Adaptive and context-aware technologies have enabled digital maps that offer personalized assistance by sensing, inferring, and utilizing the map user’s environmental, technological, and user context. Map adaptation is particularly helpful when the interaction possibilities with the map are restricted or when time constraints apply. Despite maps becoming increasingly adaptive, the semantic gap between the contextual information used for adaptation and the user’s high-level cognitive processes that should ultimately be supported remains wide. If a map-based assistance system could reliably infer a user’s intentions, it could provide highly personalized assistance and support the user’s goals in an optimal way.

Since vision is the primary sense for perceiving cartographic maps, the user’s gaze behavior while interacting with a map is likely to provide valuable information on the user’s cognitive states in terms of intentions. With eye tracking technology it is possible to sense the user’s gaze on a cartographic map, relate it to the map contents, and use the acquired information for gaze-based adaptation. The proposed project envisions intention-aware gaze-based assistance on cartographic maps. A future intention-aware gaze-based assistive map could, for instance, recognize from the user’s gaze that he or she is planning a touristic round trip and adapt to the user’s needs accordingly.

The main objective of this project is to investigate methods for the recognition of activities and intentions from gaze data collected from cartographic map users. The project intends to investigate the following research questions:

1. How can visible map features (points, lines, polygons) be used to assign meaning to a user’s gaze on a cartographic map?

2. How can a map user’s activities be recognized from an incomplete gaze track (i.e., online), and how can a change in the performed activity be detected (segmentation problem)?

3. How can the recognized basic activities be combined and interpreted at a higher semantic level, i.e., with respect to the map user’s intentions?

Methods used in geographic data processing (e.g., map matching), mobile assistance systems (e.g., trajectory interpretation), activity recognition (e.g., machine learning from gaze tracks), and general plan recognition (e.g., dynamic Bayesian models) will be extended and adapted to recognize a map user’s activities and intentions from gaze. The expected outcomes of the project include a) algorithmic approaches for online geometric matching of gaze with cartographic vector features, b) an evaluation of machine learning techniques for the recognition and segmentation of activities on cartographic maps, c) a methodology for the recognition of intentions from gaze on cartographic maps, and d) an annotated eye tracking corpus, to be published online with the goal of stimulating further research in the eye tracking, artificial intelligence (AI), and geographic information science (GIScience) research communities. The eye tracking corpus will be created by collecting gaze data in studies in a controlled experimental setting (remote eye tracking) using various cartographic stimuli and tasks.
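The first expected outcome, online geometric matching of gaze with cartographic vector features, can be illustrated with a minimal sketch. This is not the project's implementation (cf. the FeaturEyeTrack publications above): it assumes fixations have already been transformed into map coordinates, uses the Shapely library for the geometric predicates, and the feature names and tolerance value are invented for illustration.

```python
# Minimal sketch: matching gaze fixations to cartographic vector features.
# Assumes fixations are already given in map coordinates; uses Shapely for
# geometric predicates. Feature names and the tolerance are illustrative only.
from shapely.geometry import Point, shape

def match_fixation_to_features(fixation_xy, features, tolerance=10.0):
    """Return the features a fixation falls on, or near (within `tolerance` map units).

    `features` is an iterable of GeoJSON-like dicts with 'geometry' and 'properties'.
    """
    gaze_point = Point(fixation_xy)
    hits = []
    for feature in features:
        geom = shape(feature["geometry"])
        # Points and lines have (near-)zero area, so use a distance threshold;
        # polygons are additionally matched by containment.
        if geom.contains(gaze_point) or geom.distance(gaze_point) <= tolerance:
            hits.append(feature["properties"].get("name", "<unnamed>"))
    return hits

# Example with two illustrative features
features = [
    {"geometry": {"type": "Polygon",
                  "coordinates": [[(0, 0), (100, 0), (100, 100), (0, 100), (0, 0)]]},
     "properties": {"name": "park"}},
    {"geometry": {"type": "LineString", "coordinates": [(0, 50), (200, 50)]},
     "properties": {"name": "main_street"}},
]
print(match_fixation_to_features((40, 52), features))  # ['park', 'main_street']
```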
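The abstract also mentions dynamic Bayesian models for recognizing activities and intentions from gaze sequences. The following hidden Markov model sketch (the simplest such model) is not the project's method: the two activities, the discretized gaze observations, and all probabilities are invented purely for illustration, and Viterbi decoding is shown as one standard way to infer the most likely activity sequence.

```python
# Generic sketch of activity recognition over a gaze sequence with a hidden Markov
# model (the simplest dynamic Bayesian model). Activities, observation symbols,
# and all probabilities are invented for illustration.
import numpy as np

activities = ["route_planning", "poi_search"]                 # hidden states (hypothetical)
observations = ["fixate_road", "fixate_poi", "fixate_label"]  # discretized gaze events

start_p = np.array([0.5, 0.5])
trans_p = np.array([[0.9, 0.1],       # activities tend to persist between fixations
                    [0.1, 0.9]])
emit_p = np.array([[0.7, 0.1, 0.2],   # route planning -> mostly road fixations
                   [0.1, 0.6, 0.3]])  # POI search     -> mostly POI fixations

def viterbi(obs_indices):
    """Most likely activity sequence for a sequence of observed gaze events."""
    n_states, T = len(activities), len(obs_indices)
    log_delta = np.log(start_p) + np.log(emit_p[:, obs_indices[0]])
    backptr = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = log_delta[:, None] + np.log(trans_p)   # (from_state, to_state)
        backptr[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + np.log(emit_p[:, obs_indices[t]])
    path = [int(log_delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t][path[-1]]))
    return [activities[s] for s in reversed(path)]

gaze_events = ["fixate_road", "fixate_road", "fixate_poi", "fixate_poi", "fixate_label"]
obs_idx = [observations.index(o) for o in gaze_events]
print(viterbi(obs_idx))
# -> ['route_planning', 'route_planning', 'poi_search', 'poi_search', 'poi_search']
```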