Preisig Basil C., Eggenberger Noëmi, Cazzoli Dario, Nyffeler Thomas, Gutbrod Klemens, Annoni Jean-Marie, Meichtry Jurka R., Nef Tobias, Müri René M. (2018), Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation, in Frontiers in Human Neuroscience, 12, 1-12.
Eggenberger Noëmi, Preisig Basil C., Schumacher Rahel, Hopfner Simone, Vanbellingen Tim, Nyffeler Thomas, Gutbrod Klemens, Annoni Jean-Marie, Bohlhalter Stephan, Cazzoli Dario, Müri René M. (2016), Comprehension of Co-Speech Gestures in Aphasic Patients: An Eye Movement Study, in PLOS ONE, 11(1), e0146583.
Preisig Basil C., Eggenberger Noëmi, Zito Giuseppe, Vanbellingen Tim, Schumacher Rahel, Hopfner Simone, Gutbrod Klemens, Nyffeler Thomas, Cazzoli Dario, Annoni Jean-Marie, Bohlhalter Stephan, Müri René M. (2016), Eye gaze behaviour at turn transition: How aphasic patients process speakers' turns during video observation, in Journal of Cognitive Neuroscience, 28(10), 1613-1624.
Schumacher Rahel, Cazzoli Dario, Eggenberger Noëmi, Preisig Basil C., Nef Tobias, Nyffeler Thomas, Gutbrod Klemens, Annoni Jean-Marie, Müri René M. (2015), Cue Recognition and Integration – Eye Tracking Evidence of Processing Differences in Sentence Comprehension in Aphasia, in PLOS ONE, 10(11), e0142853.
Vanbellingen Tim, Schumacher Rahel, Eggenberger Noëmi, Hopfner Simone, Cazzoli Dario, Preisig Basil C., Bertschi Manuel, Nyffeler Thomas, Gutbrod Klemens, Bassetti Claudio L., Bohlhalter Stephan, Müri René M. (2015), Different visual exploration of tool-related gestures in left hemisphere brain damaged patients is associated with poor gestural imitation, in Neuropsychologia, 71, 158-164.
Preisig Basil C., Eggenberger Noëmi, Zito Giuseppe, Vanbellingen Tim, Schumacher Rahel, Hopfner Simone, Nyffeler Thomas, Gutbrod Klemens, Annoni Jean-Marie, Bohlhalter Stephan, Müri René M. (2015), Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations, in Cortex, 64, 157-168.
People commonly gesture while speaking, and such speech-associated gestures show great behavioral variety. When aphasia limits verbal expression, gestures may compensate for this deficit, but their compensatory use requires adaptation to the loss of verbal context: the gestures can no longer rely on disambiguation by concurrent speech and must at the same time carry an increased load of information. Healthy subjects adapt their gestures easily when obstacles such as not knowing the local language or acoustic problems prevent oral communication. The literature on aphasia and co-speech gestures is not unanimous, and many questions remain open. Do aphasic patients use more co-speech gestures to communicate? Are these gestures communicative, or are they used to facilitate speech and/or to overcome speech perseveration? Finally, does apraxia prevent the successful use of co-speech gestures by aphasic patients?

In a series of experiments, we will examine gesture production and gesture comprehension in patients with aphasia. We will use behavioral analysis of gesture and speech production, event-related potentials (ERP), and gaze behavior analysis during gesture perception. Furthermore, lesion-symptom analysis will be performed in the aphasic patients, and apraxia will be evaluated with the test of upper limb apraxia (TULIA). Eye movement analysis has been shown to be a valuable method for studying conditions of competitive parallel processing in complex human behavior. In a preliminary experiment, we measured gaze behavior in aphasic patients and healthy controls during video presentation of gestures. Taken together, the results show that aphasic patients and healthy controls differ in the way they look at gestures and that these differences can be successfully assessed by means of eye movement measurements. Aphasic patients appear especially handicapped when a gesture is meaningless, reflected in significantly increased mean fixation duration and cumulative fixation duration.

The project is of primary importance for two reasons. First, we will analyze not only co-speech gesture production and language comprehension, but also the perception of co-speech gestures, using gaze behavior measurements and ERP analysis. In addition, lesion-symptom mapping in the aphasic patients is expected to provide further information about the cortical organization of the network controlling co-speech gestures and language in aphasia. The project will thus contribute to the understanding of the development of, and the relationship between, language, perception, and action. Second, understanding the role of apraxia in co-speech gesture perception and production will be important for future aphasia rehabilitation concepts.
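As a minimal illustration of the two gaze measures named above (mean fixation duration and cumulative fixation duration), the following Python sketch shows one way they could be computed from a fixation-level table. The column names, group labels, and example values are assumptions for illustration only and do not represent the project's actual analysis pipeline or data.

# Sketch: computing mean and cumulative fixation duration per group.
# Columns "group", "trial", and "duration_ms" are hypothetical.
import pandas as pd

# Each row is one fixation detected by the eye tracker during video observation.
fixations = pd.DataFrame({
    "group":       ["aphasic", "aphasic", "aphasic", "control", "control"],
    "trial":       [1, 1, 2, 1, 2],
    "duration_ms": [420.0, 380.0, 510.0, 260.0, 240.0],
})

# Cumulative fixation duration: total time spent fixating within a trial.
# Mean fixation duration: average length of a single fixation within a trial.
per_trial = (
    fixations
    .groupby(["group", "trial"])["duration_ms"]
    .agg(cumulative_ms="sum", mean_ms="mean")
    .reset_index()
)

# Average the trial-level measures within each group for a simple comparison.
group_summary = per_trial.groupby("group")[["cumulative_ms", "mean_ms"]].mean()
print(group_summary)

In practice, such trial- or participant-level summaries would feed into the group statistics reported in the studies listed above; the sketch only makes explicit how the two duration measures are aggregated.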