Perception of emotion from single sung notes: an event-related brain potential study of valence and identity matching

Goydke, K. N.1,3, Urbach, T. P.2, Kutas, M.2, Altenmueller, E.1, and Muente, T. F.3
1Institute of Music Physiology and Performing Artists Medicine, University for Music and Drama Hannover, Germany; 2Department of Cognitive Science, University of California San Diego, CA; 3Department of Neuropsychology, University of Magdeburg, Germany
E-mail: katja.goydke@hmt-hannover.de

The current study asked whether single notes, like brief presentations of faces and pictures, have affective valence, i.e., can be perceived as happy or sad. To that end, 10 participants listened to the same sequential pairs of sung tones (300 to 600 ms in length) under two task conditions. In the Valence Matching Task, participants indicated via a button press whether the emotions conveyed by the two notes of a pair were the same or different. In the Voice Matching Task, participants indicated whether or not the singer of the first note (S1) and the second note (S2) was the same. Though they found the task difficult, participants were able to perceive the affective valence of single notes (valence task: 64% correct; voice task: 66% correct). Concurrently recorded event-related brain potentials (ERPs), aimed at assessing the time course and spatial distribution of the categorization processes, showed that, regardless of task, the ERP to S2 was more positive from 200 to 1000 ms post tone onset when S2 was preceded by an incongruent rather than a congruent sound; neither the timing nor the scalp distribution of these mismatch effects differed between voice and valence matching. In sum, musical units as small as single notes are suitable for expressing basic emotions, which the brain can decode within 200 ms, at least when attended. This finding suggests that affective information in music is carried not only by large-scale features such as tempo and rhythm but also by features of individual notes.