Emotions amplify speaker-listener neural alignment.
Dmitry Smirnov, Heini Saarimäki, Enrico Glerean, Riitta Hari, Mikko Sams, Lauri Nummenmaa
Published in: Human Brain Mapping (2019)
Individuals often align their emotional states during conversation. Here, we reveal how such emotional alignment is reflected in synchronization of brain activity across speakers and listeners. Two "speaker" subjects told emotional and neutral autobiographical stories while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). The stories were recorded and played back to 16 "listener" subjects during fMRI. After scanning, both speakers and listeners rated the moment-to-moment valence and arousal of the stories. Time-varying similarity of the blood-oxygenation-level-dependent (BOLD) time series was quantified by intersubject phase synchronization (ISPS) between speaker-listener pairs. Telling and listening to the stories elicited similar emotions across speaker-listener pairs. Arousal was associated with increased speaker-listener neural synchronization in brain regions supporting attentional, auditory, somatosensory, and motor processing. Valence was associated with increased speaker-listener neural synchronization in brain regions involved in emotional processing, including amygdala, hippocampus, and temporal pole. Speaker-listener synchronization of subjective feelings of arousal was associated with increased neural synchronization in somatosensory and subcortical brain regions; synchronization of valence was associated with neural synchronization in parietal cortices and midline structures. We propose that emotion-dependent speaker-listener neural synchronization is associated with emotional contagion, thereby implying that listeners reproduce some aspects of the speaker's emotional state at the neural level.
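To make the ISPS measure concrete, below is a minimal sketch of how time-resolved phase synchronization between one speaker and one listener BOLD time series could be computed. The band-pass range, TR, and the pairwise synchronization formula (magnitude of the mean unit phase vector of the pair) are illustrative assumptions, not the paper's exact pipeline.

```python
# Illustrative pairwise intersubject phase synchronization (ISPS) sketch.
# Assumptions (not from the paper): TR = 2 s, band-pass 0.04-0.07 Hz,
# pairwise ISPS defined as |exp(i*phi1) + exp(i*phi2)| / 2.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def bandpass(ts, low, high, fs, order=2):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, ts)


def pairwise_isps(speaker_ts, listener_ts, tr=2.0, low=0.04, high=0.07):
    """Time-resolved phase synchronization between two ROI/voxel time series.

    Returns a value in [0, 1] per time point: 1 = identical instantaneous
    phase, 0 = opposite phase.
    """
    fs = 1.0 / tr  # sampling frequency in Hz
    phases = []
    for ts in (speaker_ts, listener_ts):
        filtered = bandpass(ts - ts.mean(), low, high, fs)
        phases.append(np.angle(hilbert(filtered)))  # instantaneous phase
    # Magnitude of the mean unit phase vector across the pair
    return np.abs(np.exp(1j * phases[0]) + np.exp(1j * phases[1])) / 2.0


if __name__ == "__main__":
    # Synthetic example: two noisy signals sharing a slow 0.05 Hz component
    rng = np.random.default_rng(0)
    t = np.arange(300) * 2.0                       # 300 volumes, TR = 2 s
    shared = np.sin(2 * np.pi * 0.05 * t)
    speaker = shared + 0.5 * rng.standard_normal(t.size)
    listener = shared + 0.5 * rng.standard_normal(t.size)
    isps = pairwise_isps(speaker, listener)
    print(f"mean ISPS: {isps.mean():.2f}")
```

In this kind of analysis, the resulting ISPS time course per brain region can then be related to the moment-to-moment arousal and valence ratings, for example via regression across time points.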