Project


Multimodal Computational Modeling of Nonverbal Social Behavior in Face to Face Interaction

Applicant Aran Oya
Number 136811
Funding scheme Ambizione
Research institution IDIAP Institut de Recherche
Institution of higher education Idiap Research Institute - IDIAP
Main discipline Information Technology
Start/End 01.12.2011 - 31.12.2014
Approved amount 391'567.00

All Disciplines (3)

Discipline
Information Technology
Sociology
Psychology

Keywords (5)

social computing; nonverbal behavior; multimodal modeling; social interaction modeling; social media

Lay Summary (English)

Lead
Lay summary
This project proposes to build computational models of social constructs that define the social behavior of individuals and groups in face-to-face conversations, perceived via audio, visual, or mobile sensors. The aim is to automatically analyze the social behavior of individuals during their interaction through their nonverbal signals and to build models that estimate several social concepts using machine learning techniques. The novelty of the proposed approach is that it investigates computational approaches that exploit the close relation between related social constructs, such as dominance and leadership, or personality and dominance, during the learning process. The assumption is that these social constructs are related, so automatic inference for one concept can take advantage of the other. The project follows a joint learning approach that combines the individual characteristics of participants in a group, such as personality and mood, with their social position in the group, such as dominance or roles, resulting from intra-group interaction and relations, as well as the overall group structure. Another novelty of the proposal is the use of social media content to learn the social behavior of individuals. In contrast to the limited amounts of data typically used to build computational models of social behavior, social media sites provide a vast amount of data on natural human behavior. The project aims to transfer the knowledge that can be extracted from audio-visual behavioral content in social media (e.g. video blogging sites, video discussion sites, video lecture sites) to small group settings.
Last update: 21.02.2013

Responsible applicant and co-applicants

Name Institute

Employees

Publications

Publication
Predicting the Performance in Decision-Making Tasks: From Individual Cues to Group Interaction
Avci Umut and Aran Oya (2016), Predicting the Performance in Decision-Making Tasks: From Individual Cues to Group Interaction, in IEEE Transactions on Multimedia, 18(4), 643-658.
Personality Trait Classification via Co-Occurrent Multiparty Multimodal Event Discovery
Okada Shogo, Aran Oya, Gatica-Perez Daniel (2015), Personality Trait Classification via Co-Occurrent Multiparty Multimodal Event Discovery, in Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, Seattle, WA, USA, ACM, New York, NY, USA.
Broadcasting oneself: Visual Discovery of Vlogging Styles
Aran Oya, Biel Joan-Isaac, Gatica-Perez Daniel (2014), Broadcasting oneself: Visual Discovery of Vlogging Styles, in IEEE Transactions on Multimedia, 16(1), 201-215.
Effect of nonverbal behavioral patterns on the performance of small groups
Avci Umut, Aran Oya (2014), Effect of nonverbal behavioral patterns on the performance of small groups, in Proceedings of the 2014 Workshop on Understanding and Modeling Multiparty, Multimodal Interactions, ACM, New York, USA.
How Do You Like Your Virtual Agent?: Human-Agent Interaction Experience through Nonverbal Features and Personality Traits
Cerekovic Aleksandra, Aran Oya, Gatica-Perez Daniel (2014), How Do You Like Your Virtual Agent?: Human-Agent Interaction Experience through Nonverbal Features and Personality Traits, in Human Behavior Understanding, Springer, US.
Creative Applications of Human Behavior Understanding
Salah Albert Ali, Hung Hayley, Aran Oya, Gunes Hatice (2013), Creative Applications of Human Behavior Understanding, in Human Behavior Understanding, Lecture Notes in Computer Science, 8212, 1-14.
Cross-domain personality prediction: from video blogs to small group meetings
Aran Oya, Gatica-Perez Daniel (2013), Cross-domain personality prediction: from video blogs to small group meetings, in International Conference on Multimodal Interaction, ICMI 2013, Sydney, Australia, ACM, New York, USA.
Human Behavior Understanding - 4th International Workshop, HBU 2013, Barcelona, Spain, October 22, 2013. Lecture Notes in Computer Science 8212
Salah Albert Ali (ed.) (2013), Human Behavior Understanding - 4th International Workshop, HBU 2013, Barcelona, Spain, October 22, 2013. Lecture Notes in Computer Science 8212, Springer, Switzerland.
One of a kind: inferring personality impressions in meetings
Aran Oya, Gatica-Perez Daniel (2013), One of a kind: inferring personality impressions in meetings, in International Conference on Multimodal Interaction, ICMI 2013, Sydney, Australia, ACM, New York, USA.
Vision based personality analysis using transfer learning methods
Kindiroǧlu Ahmet Alp, Akarun Lale, Aran Oya (2013), Vision based personality analysis using transfer learning methods, in 22nd Signal Processing and Communications Applications Conference (SIU 2014), Trabzon, Turkey, IEEE, USA.
Emergent leaders through looking and speaking: from audio-visual data to multimodal recognition
Sanchez-Cortes Dairazalia, Aran Oya, Jayagopi Dinesh Babu, Schmid Mast Marianne, Gatica-Perez Daniel (2012), Emergent leaders through looking and speaking: from audio-visual data to multimodal recognition, in Journal on Multimodal User Interfaces, 1.
Modeling dominance effects on nonverbal behaviors using granger causality
Kalimeri Kyriaki, Lepri Bruno, Aran Oya, Jayagopi Dinesh Babu, Gatica-Perez Daniel, Pianesi Fabio (2012), Modeling dominance effects on nonverbal behaviors using granger causality, in Proceedings of the International Conference on Multimodal Interaction, ICMI 2012, Santa Monica, CA, ACM, New York, USA.
Automatic Sign Language Recognition and Applications for Turkish Sign Language (in Turkish)
Aran Oya, Ari Ismail, Kindiroglu Alp, Santemiz Pinar, Akarun Lale, Automatic Sign Language Recognition and Applications for Turkish Sign Language (in Turkish), in Arik Engin (ed.), Koc University Press, Turkey.
Modeling Annotator Behaviors for Crowd Labeling
Kara Yunus Emre, Genc Gaye, Aran Oya, Akarun Lale, Modeling Annotator Behaviors for Crowd Labeling, in Neurocomputing.
Rapport with Virtual Agents: What do Human Social Cues and Personality Explain?
Cerekovic Aleksandra, Aran Oya, Gatica-Perez Daniel (2016), Rapport with Virtual Agents: What do Human Social Cues and Personality Explain?, in IEEE Transactions on Affective Computing.
Small Group Analysis
Gatica-Perez Daniel, Aran Oya, Jayagopi Dinesh, Small Group Analysis, in Burgoon J. (ed.), Cambridge University Press, UK.

Collaboration

Group / person Country
Types of collaboration
Social Computing Group / Idiap Switzerland (Europe)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
- Research Infrastructure
Queen Mary Univ. of London (QMUL) Great Britain and Northern Ireland (Europe)
- Publication
Bogazici University Turkey (Europe)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
MIT Media Lab United States of America (North America)
- Publication
Delft University of Technology Netherlands (Europe)
- Publication
Bruno Kessler Foundation Italy (Europe)
- Publication
Marianne Schmid-Mast / University of Neuchatel Switzerland (Europe)
- in-depth/constructive exchanges on approaches, methods or results
- Publication
- Research Infrastructure

Scientific events

Active participation

Title Type of contribution Title of article or contribution Date Place Persons involved
ICMI 2014 workshop on Understanding and Modeling Multiparty, Multimodal Interactions Talk given at a conference Effect of nonverbal behavioral patterns on the performance of small groups 16.11.2014 Istanbul, Turkey Aran Oya; Avci Umut;
HBU 2014 - ECCV 2014 Workshop on Human Behavior Understanding Talk given at a conference How Do You Like Your Virtual Agent?: Human-Agent Interaction Experience through Nonverbal Features and Personality Traits 12.09.2014 Zurich, Switzerland Cerekovic Aleksandra; Aran Oya;
SIU 2014 - IEEE 22nd Signal Processing and Communications Applications Conference Talk given at a conference Vision based personality analysis using transfer learning methods 23.04.2014 Trabzon, Turkey Kindiroglu Ahmet Alp; Aran Oya;
ICMI 2013 Poster Cross-domain personality prediction: from video blogs to small group meetings 11.12.2013 Sydney, Australia Aran Oya;
ICMI 2013 Talk given at a conference One of a kind: inferring personality impressions in meetings 11.12.2013 Sydney, Australia Aran Oya;
SONVB Workshop Individual talk Domain Adaptation for Personality Prediction 20.09.2012 Neuchatel, Switzerland Aran Oya;


Self-organised

Title Date Place
Human Behavior Understanding Workshop 22.10.2013 Barcelona, Spain

Associated projects

Number Title Start Funding scheme
127542 SONVB: Sensing and Analyzing Organizational Nonverbal Behavior 01.06.2010 Sinergia
130152 Robust face tracking, feature extraction and multimodal fusion for audio-visual speech recognition and visual attention modeling in complex environment 01.04.2010 Project funding (Div. I-III)
118632 Nonverbal communication in the medical encounter 01.05.2009 ProDoc

Abstract

Over the last decade, there has been strong interest in analyzing human actions, especially in smart room applications. However, the intelligence of these systems is limited to performing given tasks, without considering the social situation in the environment or the social behavior of the people around them. Among the range of applications that provide support systems in rooms equipped with sensing devices, the computational analysis of social interaction is an emerging field of research that aims to make these support systems socially aware. Research on computational social behavior analysis, and this research program in particular, aims to deliver results that enable the development of tools to improve quality of life, such as collective decision-making support systems, meeting support systems, and systems for self-assessment, training, and education.

This project proposes to build computational models of social constructs that define the social behavior of individuals and groups in face-to-face conversations, perceived via audio, visual, or mobile sensors. The aim is to automatically analyze the social behavior of individuals during their interaction through their nonverbal signals and to build models that estimate several social concepts using machine learning techniques. The novelty of the proposed approach is that it investigates computational approaches that exploit the close relation between related social constructs, such as dominance and leadership, or personality and dominance, during the learning process. The assumption is that these social constructs are related, so automatic inference for one concept can take advantage of the other. The project follows a joint learning approach that combines the individual characteristics of participants in a group, such as personality and mood, with their social position in the group, such as dominance or roles, resulting from intra-group interaction and relations, as well as the overall group structure. Another novelty of the proposal is the use of social media content to learn the social behavior of individuals. In contrast to the limited amounts of data typically used to build computational models of social behavior, social media sites provide a vast amount of data on natural human behavior. The project aims to transfer the knowledge that can be extracted from audio-visual behavioral content in social media (e.g. video blogging sites, video discussion sites, video lecture sites) to small group settings.

My research program is structured in four main phases. The first phase consists of modeling the individual characteristics of participants. For this purpose, a database collected from social media will be used together with annotations on personality and mood. Techniques to transfer the knowledge gained from this database to small group settings will be investigated in the second phase. The third phase of the project focuses on modeling the group interaction and group structure, and the fourth phase aims at developing a joint learning approach to learn related social tasks.
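The joint learning idea outlined above can be illustrated with a minimal sketch. The abstract does not specify a particular algorithm, so the feature set, the synthetic data, and the choice of scikit-learn's MultiTaskLasso below are purely illustrative assumptions; the sketch only shows how related social constructs (e.g. dominance, emergent leadership, extraversion) can be predicted jointly from shared nonverbal cues instead of independently.

# Illustrative sketch only: joint (multi-task) learning of related social
# constructs from nonverbal features. All names and data here are assumptions;
# the project abstract does not prescribe this estimator.
import numpy as np
from sklearn.linear_model import Lasso, MultiTaskLasso
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-participant nonverbal features
# (e.g. speaking time, speaking turns, received visual attention, prosody stats).
n_participants, n_features = 200, 12
X = rng.normal(size=(n_participants, n_features))

# Three related target constructs: correlated because they depend on
# overlapping cues (columns: dominance, leadership, extraversion).
true_w = np.zeros((n_features, 3))
true_w[:4, :] = rng.normal(size=(4, 3))   # shared cues drive all three constructs
true_w[4, 0] = 1.0                        # plus one cue specific to the first construct
Y = X @ true_w + 0.5 * rng.normal(size=(n_participants, 3))

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Baseline: learn each construct independently.
indep_pred = np.column_stack([
    Lasso(alpha=0.1).fit(X_tr, Y_tr[:, t]).predict(X_te) for t in range(3)
])

# Joint learning: one model for all constructs with a shared sparsity pattern,
# so inference for one concept benefits from the cues selected for the others.
joint = MultiTaskLasso(alpha=0.1).fit(X_tr, Y_tr)
joint_pred = joint.predict(X_te)

print("independent R2:", r2_score(Y_te, indep_pred, multioutput="uniform_average"))
print("joint       R2:", r2_score(Y_te, joint_pred, multioutput="uniform_average"))

When the constructs truly share underlying cues, the jointly trained model typically matches or improves on the independent baselines while relying on a common set of selected features, which is the kind of advantage between related social constructs that the project aims to exploit.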