In recent years the presence and use of interactive devices has grown rapidly. Mobile phones are omnipresent and smartphones keep us connected to social networks at all times, cars are equipped with satellite navigation systems and can sense when we are tired, and children’s stuffed animals and dolls can ‘talk’ and express wishes. With interactive devices becoming ever more ubiquitous, the field of human-computer interaction has turned away from studying business environments and how to make interaction more effective and efficient, and towards understanding how interaction can become more joyous and satisfying in all facets of everyday life. We therefore need measures and tools to assess these experiences. We propose G-DEDUCE to extend project DEDUCE, funded by SNSF since January 2009, and to expand its findings while maintaining the same aim and focus.
The aim of this project is to find new tools that allow assessing users’ affective states in evaluation trials of interactive products.
In a series of user studies run by DEDUCE, we observed children playing video games and saw how they expressed their emotional states non-verbally. When we watched the recordings with the sound turned off, we could still easily identify when children were happy and proud after winning a game, or disappointed and sad after losing, purely from how they moved.
The literature suggests that reading the emotional states of the people around us comes naturally to us. As social beings, it is important for us to know whether our environment is friendly and at ease or tense and hostile. Non-verbal expression of emotion involves the tone of the voice, facial expressions, and expressions from the rest of the body, i.e. gesture, posture, and body movement.
Here we propose to investigate and develop an automatic emotion recognition system based on posture and body movements.
We will develop a series of working prototypes for gesture capture and interpretation. We will then build a database of affective movement data by recording movement data and coding the recordings through walkthroughs with participants and/or observer ratings. Finally, we will train a classifier to automatically identify emotional states.
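The final step above can be illustrated with a minimal sketch: coded movement recordings become labelled feature vectors, and a classifier learns to map new movement data to emotional states. The feature names (mean limb speed, posture openness), the toy data, and the nearest-centroid classifier below are illustrative assumptions, not the project's actual design.

```python
# Illustrative sketch only: a nearest-centroid classifier over hypothetical
# movement features. The real project would use captured motion data and a
# classifier chosen empirically.
from statistics import mean

# Toy "database" of coded movement samples: (feature vector, emotion label).
# Features (assumed for illustration): [mean_limb_speed, posture_openness].
training_data = [
    ([0.9, 0.8], "happy"),
    ([0.8, 0.9], "happy"),
    ([0.2, 0.1], "sad"),
    ([0.3, 0.2], "sad"),
]

def train_centroids(samples):
    """Average the feature vectors of each emotion label."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    return {label: [mean(col) for col in zip(*vecs)]
            for label, vecs in by_label.items()}

def classify(centroids, features):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(centroids, key=lambda label: dist(centroids[label]))

centroids = train_centroids(training_data)
print(classify(centroids, [0.85, 0.75]))  # -> happy
print(classify(centroids, [0.25, 0.15]))  # -> sad
```

In practice the coding step (participant walkthroughs or observer ratings) supplies the labels, and the capture prototypes supply the feature vectors; the classifier itself is interchangeable.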
The originality of the study lies in its attention to a crucial area of interdisciplinary research: evaluation techniques and their use in different scenarios. The project will deliver interactive tools to be used both at evaluation time and when analysing data, providing an initial, if rough, description of user interactions with a system, ready for interpretation.