Emotional speech processing: Effects of gender and culture

Schirmer, A.
Max Planck Institute of Cognitive Neuroscience, Leipzig, Germany
E-mail: schirmer@cns.mpg.de

Speakers can express emotions both verbally and prosodically. Emotional prosodic expression refers to modulations of acoustic parameters such as loudness, speech melody, and tempo, and is known to help listeners interpret the emotional message conveyed verbally. Moreover, recent ERP findings suggest that emotional prosody can establish contextual information that influences word processing in a way similar to semantic context. Words that are incongruous with emotional prosody (e.g., positive words spoken with a negative prosody) elicit a larger N400 than congruous words (e.g., negative words spoken with a negative prosody). Interestingly, however, this emotional-prosodic context effect depends on experimental variables such as the time lag between prosodic information and the onset of the critical word, the instruction to ignore or to attend to prosody, and the listeners' sex and cultural background. My talk will focus on the effects of these variables as revealed by a series of ERP studies that investigated emotional speech processing in German and Cantonese listeners.