TY - JOUR
T1 - The N400 and late occipital positivity in processing dynamic facial expressions with natural emotional voice
AU - Mori, Kazuma
AU - Tanaka, Akihiro
AU - Kawabata, Hideaki
AU - Arao, Hiroshi
N1 - Publisher Copyright:
© 2021 Wolters Kluwer Health, Inc. All rights reserved.
PY - 2021
Y1 - 2021
AB - People rely on multimodal emotional interactions to live in a social environment. Several studies using dynamic facial expressions and emotional voices have reported that multimodal emotional incongruency evokes an early sensory component of event-related potentials (ERPs), whereas others have found a late cognitive component. How these two sets of results can be reconciled remains unclear. We speculated that it is semantic analysis within a multimodal integration framework that evokes the late ERP component. To promote semantic analysis, an electrophysiological experiment was conducted using emotionally congruent or incongruent dynamic faces paired with natural voices. To investigate top-down modulation of the ERP components, attention was manipulated via two tasks that directed participants to attend to either the facial or the vocal expressions. Our results revealed interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N400 amplitude, but not the N1 or P2 amplitudes, for incongruent emotional face-voice combinations in the face-attentive task only. A late occipital positive potential emerged only during the voice-attentive task. Overall, these findings support the idea that semantic analysis is a key factor in evoking the late cognitive component. The task effects on these ERPs suggest that top-down attention alters not only the amplitude of an ERP component but also which component emerges. Our results suggest a principle of emotional face-voice processing in the brain that may underlie complex audiovisual interactions in everyday communication.
KW - N400 amplitudes
KW - audiovisual interactions
KW - emotion
KW - late occipital positivity
KW - multimodal integration framework
UR - http://www.scopus.com/inward/record.url?scp=85107608880&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107608880&partnerID=8YFLogxK
U2 - 10.1097/WNR.0000000000001669
DO - 10.1097/WNR.0000000000001669
M3 - Article
C2 - 34029292
AN - SCOPUS:85107608880
SN - 0959-4965
SP - 858
EP - 863
JO - NeuroReport
JF - NeuroReport
ER -