Klinische Neurophysiologie 2008; 39 - A168
DOI: 10.1055/s-2008-1072970

Impaired emotional face processing in patients with multiple sclerosis: an ERP analysis of the N170 component

M Adamaszek 1, M Weymar 2, J Berneiser 3, A Dressel 3, C Kessler 3, A Hamm 2
  • 1University of Leipzig, Department of Psychiatry
  • 2University of Greifswald, Department of Psychology
  • 3University of Greifswald, Department of Neurology

Background: Besides variable neuropsychological disturbances, patients with multiple sclerosis (MS) show impairments in discriminating emotional facial expressions. In ERP studies, the processing of faces is related to the N170, a negative potential peaking around 170 ms and typically identified over temporo-occipital sensor sites. The neural sources of the face-sensitive N170 have repeatedly been localized in the fusiform gyrus, with additional activations in temporal, parietal and occipital brain structures. While lesions of the right temporal and parietal cortex are critically involved in deficits in recognizing affective facial expressions, less is known about their influence on earlier stages of face processing (structural encoding), as reflected by the N170.

Methods: 26 outpatients with definite MS and 11 control subjects participated in this study. The MS patients were examined for deficits in emotional face recognition using subtests 1–5 of the Tübinger Affect Battery (TAB) and were then divided into MS patients with high clinical performance (MS II) and MS patients with low clinical performance (MS I). To study the N170, we presented 60 paired faces according to subtest 2 (affect discrimination) of the TAB in one trial, and 60 faces, each followed by a decision display of four different faces, according to subtest 5 (affect matching) of the TAB in a second trial. In an additional run, subjects viewed 180 photographs of faces, animals and neutral objects in order to evaluate whether the discrimination of faces was intact. All faces were taken from the Karolinska Directed Emotional Faces (KDEF) set and consisted of neutral, angry, happy, sad and fearful facial expressions. Images of animals and neutral objects were selected from the International Affective Picture System (IAPS). ERPs elicited by faces and control stimuli were measured using a dense-array EEG (129 channels) and analyzed with EMEGS, a MATLAB-based software package. Clinical and electrophysiological data were analyzed by repeated measures ANOVAs.

Results: All participants showed an augmented N170 to faces compared to other objects (animals, household objects). While all three groups showed a comparable N170 over occipito-temporal areas for paired faces with congruent or incongruent emotional expressions in subtest 2, patients with an emotion recognition deficit in the TAB (MS I) showed a reduced right-lateralized N170 amplitude in all five conditions (neutral, angry, happy, sad or fearful faces) in subtest 5. Moreover, patients with low emotion recognition accuracy showed increased reaction times for incorrect responses. No correlations were found between the neurophysiological data and lesion load on MRI or clinical scores (MSFC).

Conclusions: The impaired N170 in MS patients with clinical deficits in recognizing emotional facial expressions in the affect matching task may represent a neurophysiological correlate of disturbed explicit social cognition already at the early stage of structural encoding within the temporo-occipital network.