Project


A Multimodal Framework to Express Affective Behaviour in Social Robots

English title A Multimodal Framework to Express Affective Behaviour in Social Robots
Applicant Tsiourti Christiana
Number 168609
Funding scheme Doc.Mobility
Research institution Institut für Automatisierungs- und Regelungstechnik Technische Universität Wien
Institution of higher education Institution abroad - IACH
Main discipline Information Technology
Start/End 01.11.2016 - 31.10.2017

Keywords (5)

Social Robots; Affective Human-Robot-Interaction; Affective Computing; Multimodal Interaction; Affect expression

Lay Summary (translated from German)

Lead
In recent years, so-called social robots have taken on an increasingly prominent role in our society and now coexist with us in everyday environments such as homes, hospitals, schools, and retirement and nursing homes. Over the same period, the means by which robots can express affective states have developed considerably. Social robots in particular use feedback modalities such as facial expressions, head and body movements, and voice. This project will systematically investigate and evaluate these modalities to determine which are best suited in human-robot interaction for conveying specific internal affective states of the robot.
Lay summary

Subject and objective of the research project

The project pursues three main objectives: (1) Building on the state of the art, a generally applicable and replicable methodology will be developed to assess the effect of the different modalities with which social robots express affective states. (2) The selected methodology will be validated in a so-called Wizard-of-Oz experiment in which users interact with social robots in realistic scenarios. In an interaction scenario, the robot will react affectively in different ways, using various combinations of feedback modalities. The users' reactions to this affective behaviour will be examined by means of questionnaires, behavioural observation, and non-invasive psychophysiological measurements. (3) The insights gained will form the basis for a generally applicable model for predicting individual reactions to the affective behaviour of robots.

Scientific and societal context

The objectives of the research project are highly topical in view of the rapidly growing scientific and commercial interest in robots. The project's results can help to improve long-term interaction with robots, which is particularly relevant in application areas such as healthcare, elderly care, and education. In contrast to previously applied interaction models, which are usually based on general rules that treat different users alike, the model developed here will explicitly take users' individual emotions and behavioural preferences into account. Such a model thus supports highly individualised communication and enables a very personal interaction between humans and robots.

Last update: 26.01.2017

Lay Summary (English)

Lead
In recent years, social robots have taken on an increasingly ubiquitous role in society and have begun to co-exist with us in real-world settings such as homes, hospitals, schools, and nursing homes. A fundamental requirement for sustaining long-term interactions with humans is the robot’s ability to display socially intelligent behaviour and to express its internal affective state. This project will investigate the use of multimodality to allow social robots to express their internal affective states in a natural, recognisable and believable way, as an integral part of social interaction with human users. The project’s findings will have important implications for the design of social robots that are more engaging and thus better accepted for long-term interactions in multiple application domains (e.g., elderly care, healthcare, education).
Lay summary
Subject and Objective
This project has three main objectives: (1) Build upon previous research to design a feasible, valid and replicable methodology to assess the affective expressions of social robots. Several evaluation measures will be selected on the basis of a literature review and combined with quantitative and qualitative assessment methods. (2) The methodology will be validated in a Wizard-of-Oz experiment in which participants interact with a social robot in realistic scenarios. The robot will express affective cues through unimodal and multimodal feedback, and the users’ reactions will be collected through questionnaires, behavioural observation and non-invasive psychophysiological sensors. The data will be analysed to measure individual reactions towards particular feedback modalities. (3) Establishing a mapping between feedback modalities and user responses will allow the systematic comparison of modalities and lead to the definition of a unified framework for predicting individual reactions to a robot’s expressions on the basis of the various elements of the feedback modalities.
 
Socio-scientific context
The project lies at the frontier between human-robot interaction (HRI) and affective computing. The research topic is quite timely given the increasing research and commercial interest in robotics. The project’s outcomes are expected to have significant implications for the design of engaging social robots that develop long-term relationships with their users in domains such as healthcare, elderly care and education. In contrast with the conventional HRI approach for the design of affective expressions for social robots, which is based on interaction models applicable to whole population segments, the assessment methodology and the unified multimodal framework take into account individual emotional and behavioural preferences, allowing the design of tailored affective expressions offering a personalised HRI experience. 
Last update: 26.01.2017

Responsible applicant and co-applicants

Publications

Publication
Designing Emotionally Expressive Robots: A Comparative Study on the Perception of Communication Modalities
Tsiourti Christiana, Weiss Astrid, Wac Katarzyna, Vincze Markus (2017), Designing Emotionally Expressive Robots: A Comparative Study on the Perception of Communication Modalities, in Proceedings of the 5th International Conference on Human Agent Interaction, Bielefeld, Germany, ACM, New York, NY, USA.
Multimodal Affective Behaviour Expression: Can It Transfer Intentions?
Tsiourti Christiana, Weiss Astrid (2017), Multimodal Affective Behaviour Expression: Can It Transfer Intentions?

Collaboration

Group / person Country
Types of collaboration
Humanoids in Architecture and Urban Spaces, Vienna University of Technology Austria (Europe)
- in-depth/constructive exchanges on approaches, methods or results

Scientific events

Active participation

Title Type of contribution Title of article or contribution Date Place Persons involved
HAI 2017 - 5th International Conference on Human-Agent Interaction Talk given at a conference Designing Emotionally Expressive Robots: A Comparative Study on the Perception of Communication Modalities 17.10.2017 CITEC, Bielefeld, Germany Tsiourti Christiana;
3rd Summer School on Social Human-Robot Interaction Poster Designing Emotionally Expressive Robots: Human Audio-Visual Perception of Robot Emotion Expressions 04.09.2017 Vila Nova de Milfontes, Portugal Tsiourti Christiana;
Workshop Barriers of Social Robotics take-up by Society Talk given at a conference Invited speaker (no article) 01.09.2017 Lisbon, Portugal Tsiourti Christiana;
HRI 2017 Workshop on “The Role of Intentions in Human-Robot Interaction” Talk given at a conference Multimodal Affective Behaviour Expression: Can It Transfer Intentions? 06.03.2017 Vienna, Austria Tsiourti Christiana;


Self-organised

Title Date Place

Communication with the public

Abstract

Robotics is a key technology for Europe’s future competitiveness, and the coming years will set the stage for how social robots could fundamentally transform our lives in domains such as healthcare, elderly care and education. However diverse the application might be, a crucial requirement is that social robots, beyond the short-term engaging effect of novelty, should be accepted and able to establish long-term relationships with humans. One of the main features extensively used to engage users in Human-Robot Interaction (HRI) is the expression of affect, an integral part of human social interaction. The range of possibilities for affect expression by humanoid social robots has grown tremendously over the past years. Many researchers have studied verbal and nonverbal feedback, but there has been little research on the use of multimodality to convey affective information in a more recognisable and believable manner. This project aims to fill this gap, and its research interest is threefold: the project will build upon previous research to design a new feasible, valid and replicable methodology, leveraging mixed methods, to assess the recognisability and believability of multimodal affective expressions of social robots. The new methodology will be validated in a Wizard-of-Oz experimental study, in which participants will interact with a social robot, to systematically compare three different feedback modalities and to validate the effectiveness of multimodality as a mechanism for conveying affective states. The study will employ quantitative and qualitative methods used in HRI to capture assessment data. In addition, psychophysiological measurements will be collected during the interaction with the robot to conduct a preliminary analysis of the influence of affective feedback on the affective state of the users.
Overall, the study will provide empirical data towards the definition of a unified multimodal framework for expressing affective behaviour in social robots across facial expression, speech and motion modalities. The framework will have important implications for the design of richer and more compelling affective expressions for social robots, facilitating long-term acceptance in multiple application domains (e.g., elderly care, healthcare, education).