
Matching Gaze Patterns and Interaction Patterns in Collaborative Tasks

English title Matching Gaze Patterns and Interaction Patterns in Collaborative Tasks
Applicant Dillenbourg Pierre
Number 117909
Funding scheme Interdisciplinary projects
Research institution Centre de Recherche et d'Appui pour la Formation et ses Technologies EPFL - CRAFT
Institution of higher education EPF Lausanne - EPFL
Main discipline Psychology
Start/End 01.01.2008 - 31.12.2010
Approved amount 262'500.00

All Disciplines (2)

Discipline
Psychology
Information Technology

Keywords (6)

Human-Computer Interaction; Computer-Supported Collaborative Learning; Eye tracking; collaborative learning; group cognition; human computer interaction

Lay Summary (English)

We propose to use two eye-tracking devices to investigate collaborative mechanisms at a depth at which they have not been studied before. Eye-tracking methods are not new; what is new is that they are now stable enough to investigate complex phenomena such as collaborative problem solving and collaborative learning. This topic lies at the frontier between computer science (groupware research and machine learning) and cognitive science (learning mechanisms and gaze analysis). We propose to record gaze paths in collaborative learning experiments and to use machine learning methods to relate gaze patterns to collaboration processes. This relationship will be explored at two levels. At the macro-level, global gaze parameters or patterns will be related to global measures of effectiveness and satisfaction in collaborative learning. At the micro-level, gaze patterns will be related to verbal interaction patterns that are known to contribute to collaborative learning, such as explanation episodes, conflict episodes, grounding episodes, etc. At both levels, we will use both supervised learning algorithms (finding properties shared by gaze records that correspond to an un/effective session or to a specific episode) and unsupervised learning algorithms (clustering/segmenting gaze records into consistent clusters and verifying whether these clusters correspond to un/effective sessions or to specific episodes). The identified patterns will be used to develop two types of gaze-sensitive applications, one displaying user A's gaze to user B in real time and the second displaying to the team a visualisation of the quality of their collaboration. The project will end with new experiments assessing the impact of these applications on collaborative learning mechanisms.
Last update: 21.02.2013
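
As a rough illustration of the unsupervised analysis described in the lay summary, the sketch below clusters per-session gaze features and checks whether the clusters line up with session effectiveness. This is not the project's actual pipeline: the feature names, labels, and data are hypothetical placeholders, and scikit-learn's KMeans merely stands in for whatever clustering method was used on the real gaze records.

# Minimal sketch of the unsupervised branch: cluster per-session gaze features,
# then verify whether gaze-only clusters correspond to un/effective sessions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)

# One row per session: e.g. mean fixation duration, saccade rate, and a
# gaze-coupling score between the two collaborators (placeholder values).
gaze_features = rng.normal(size=(40, 3))

# Hypothetical outcome labels from the macro-level measures: 1 = effective
# session, 0 = ineffective session.
effective = rng.integers(0, 2, size=40)

# Unsupervised step: segment the sessions into two clusters from gaze alone.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(gaze_features)

# Verification step: how well do the gaze-only clusters match effectiveness?
print("Adjusted Rand index:", adjusted_rand_score(effective, clusters))

With real data, a high adjusted Rand index would indicate that gaze behaviour alone separates effective from ineffective sessions; the supervised branch would instead train a classifier on the labelled sessions.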

Responsible applicant and co-applicants

Employees

Associated projects

Number Title Start Funding scheme
126611 Multimodal Interaction Modelling and Regulation of Collaborative Problem-Solving 01.01.2010 Ambizione
132996 Matching Gaze Patterns and Interaction Patterns in Collaborative Tasks (II) 01.01.2011 Interdisciplinary projects
