
How your emotional state affects how you hear speech

I found an interesting study by Wang et al. investigating how our current emotional state modulates the cortical response to speech early in the auditory processing stream. Here’s their abstract.

In order to understand how emotional state influences the listener’s physiological response to speech, subjects looked at emotion-evoking pictures while 32-channel EEG evoked responses (ERPs) to an unchanging auditory stimulus (“danny”) were collected. The pictures were selected from the International Affective Picture System database. They were rated by participants and differed in valence (positive, negative, neutral), but not in dominance and arousal. Effects of viewing negative emotion pictures were seen as early as 20 msec (p = .006). An analysis of the global field power highlighted a time period of interest (30.4–129.0 msec) where the effects of emotion are likely to be the most robust. At the cortical level, the responses differed significantly depending on the valence ratings the subjects provided for the visual stimuli, which divided them into the high valence intensity group and the low valence intensity group. The high valence intensity group exhibited a clear divergent bivalent effect of emotion (ERPs at Cz during viewing neutral pictures subtracted from ERPs during viewing positive or negative pictures) in the time period of interest (r² = .534, p < .01). Moreover, group differences emerged in the pattern of global activation during this time period. Although both groups demonstrated a significant effect of emotion (ANOVA, p = .004 and .006, low valence intensity and high valence intensity, respectively), the high valence intensity group exhibited a much larger effect. Whereas the low valence intensity group exhibited its smaller effect predominantly in frontal areas, the larger effect in the high valence intensity group was found globally, especially in the left temporal areas, with the largest divergent bivalent effects (ANOVA, p < .00001) in high valence intensity subjects around the midline. Thus, divergent bivalent effects were observed between 30 and 130 msec, and were dependent on the subject’s subjective state, whereas the effects at 20 msec were evident only for negative emotion, independent of the subject’s behavioral responses. Taken together, it appears that emotion can affect auditory function early in the sensory processing stream.
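If you're curious what those two measures look like in practice, here is a minimal numpy sketch of the quantities the abstract names: global field power, and the "divergent bivalent effect" computed as the neutral-picture ERP subtracted from the positive- or negative-picture ERP at a single electrode. The array shapes, the random data, and the Cz index are purely illustrative assumptions on my part; this is not the authors' actual analysis pipeline.

```python
import numpy as np

# Hypothetical per-condition ERPs: channels x time samples, averaged over trials.
n_channels, n_times = 32, 500                      # 32-channel EEG, arbitrary length
rng = np.random.default_rng(0)
erp_neutral = rng.normal(size=(n_channels, n_times))
erp_positive = rng.normal(size=(n_channels, n_times))
erp_negative = rng.normal(size=(n_channels, n_times))

def global_field_power(erp):
    # Global field power = the spatial standard deviation across electrodes at
    # each time point; peaks mark periods of strong, widespread activation and
    # are one way to pick a time window of interest.
    return erp.std(axis=0)

gfp = global_field_power(erp_neutral)              # 1-D array, one value per sample

# "Divergent bivalent effect" at one electrode (e.g., Cz): the ERP recorded while
# viewing neutral pictures subtracted from the ERPs recorded while viewing
# positive or negative pictures.
cz = 10                                            # hypothetical index of Cz
positive_minus_neutral = erp_positive[cz] - erp_neutral[cz]
negative_minus_neutral = erp_negative[cz] - erp_neutral[cz]
```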

Wang, J., Nicol, T., Skoe, E., Sams, M., & Kraus, N. (2009). Emotion modulates early auditory response to speech. Journal of Cognitive Neuroscience, 21(11), 2121–2128. PMID: 18855553

TheQuantumLobeChronicles.blogspot.com



