Integration of cross-modal emotional information in the human brain: an fMRI study.
Cortex. 2010 Feb; 46(2):161-9.

Abstract

The interaction of information derived from the voice and facial expression of a speaker contributes to the interpretation of the emotional state of the speaker and to the formation of inferences about information that may have been merely implied in the verbal communication. Therefore, we investigated the brain processes responsible for the integration of emotional information originating from different sources. Although several studies have reported possible sites for integration, further investigation using a neutral emotional condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in the integration of emotional information from different modalities in comparison to those involved in integrating emotionally neutral information. There was significant activation in the superior temporal gyrus (STG); inferior frontal gyrus (IFG); and parahippocampal gyrus, including the amygdala, under the bimodal versus the unimodal condition, irrespective of the emotional content. We confirmed the results of previous studies by finding that the bimodal emotional condition elicited strong activation in the left middle temporal gyrus (MTG), and we extended this finding to locate the effects of emotional factors by using a neutral condition in the experimental design. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas we found happiness-specific activation in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion uses a separate network to integrate bimodal information and shares a common network for cross-modal integration.
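
As a rough illustration of the contrast logic described in the abstract (bimodal versus unimodal presentation, with a neutral condition used to isolate emotion-specific integration effects), the sketch below builds the corresponding GLM contrast vectors. This is a minimal sketch assuming a standard 2 (modality) x 3 (emotion) factorial design; the condition names, factor levels, and weights are illustrative assumptions, not taken from the paper's actual design matrix.

```python
import numpy as np

# Hypothetical condition ordering for a 2 (modality) x 3 (emotion) design;
# the paper's actual design matrix may differ.
conditions = [
    "bimodal_anger", "bimodal_happy", "bimodal_neutral",
    "unimodal_anger", "unimodal_happy", "unimodal_neutral",
]

def contrast(weights_by_name):
    """Build a contrast vector over `conditions` from a name -> weight mapping."""
    c = np.zeros(len(conditions))
    for name, weight in weights_by_name.items():
        c[conditions.index(name)] = weight
    return c

# Bimodal > unimodal, irrespective of emotional content
# (the comparison reported to engage STG, IFG, and parahippocampal gyrus/amygdala).
bimodal_vs_unimodal = contrast({
    "bimodal_anger": 1, "bimodal_happy": 1, "bimodal_neutral": 1,
    "unimodal_anger": -1, "unimodal_happy": -1, "unimodal_neutral": -1,
})

# Emotion-specific integration for anger: the bimodal-minus-unimodal difference
# for anger minus the same difference for the neutral condition (an interaction
# term), which is how a neutral baseline can isolate emotion-specific networks.
anger_specific_integration = contrast({
    "bimodal_anger": 1, "unimodal_anger": -1,
    "bimodal_neutral": -1, "unimodal_neutral": 1,
})

print(bimodal_vs_unimodal)
print(anger_specific_integration)
```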

Authors and Affiliations

Park Ji-Young, Gu Bon-Mi, Kang Do-Hyung, Shin Yong-Wook, Choi Chi-Hoon, Lee Jong-Min, Kwon Jun Soo
Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul, Republic of Korea.

Pub Type(s)

Journal Article
Research Support, Non-U.S. Gov't

Language

eng

PubMed ID

18691703

Citation

Park, Ji-Young, et al. "Integration of Cross-modal Emotional Information in the Human Brain: an fMRI Study." Cortex; a Journal Devoted to the Study of the Nervous System and Behavior, vol. 46, no. 2, 2010, pp. 161-9.
Park JY, Gu BM, Kang DH, et al. Integration of cross-modal emotional information in the human brain: an fMRI study. Cortex. 2010;46(2):161-9.
Park, J. Y., Gu, B. M., Kang, D. H., Shin, Y. W., Choi, C. H., Lee, J. M., & Kwon, J. S. (2010). Integration of cross-modal emotional information in the human brain: an fMRI study. Cortex; a Journal Devoted to the Study of the Nervous System and Behavior, 46(2), 161-9. https://doi.org/10.1016/j.cortex.2008.06.008
Park JY, et al. Integration of Cross-modal Emotional Information in the Human Brain: an fMRI Study. Cortex. 2010;46(2):161-9. PubMed PMID: 18691703.