Marstaller, Lars and Burianova, Hana 2014. The multisensory perception of co-speech gestures - A review and meta-analysis of neuroimaging studies. Journal of Neurolinguistics 30, pp. 69-77. doi:10.1016/j.jneuroling.2014.04.003
Abstract
Co-speech gestures constitute a unique form of multimodal communication because the hand movements are temporally synchronized and semantically integrated with speech. Recent neuroimaging studies indicate that the perception of co-speech gestures might engage a core set of frontal, temporal, and parietal areas. However, no study has compared the neural processes during perception of different types of co-speech gestures, such as beat, deictic, iconic, and metaphoric co-speech gestures. The purpose of this study was to review the existing literature on the neural correlates of co-speech gesture perception and to test whether different types of co-speech gestures elicit a common pattern of brain activity in the listener. To this end, we conducted a meta-analysis of neuroimaging studies that used different types of co-speech gestures to investigate the perception of multimodal (co-speech gesture) stimuli in contrast to unimodal (speech-only or gesture-only) stimuli. The results show that co-speech gesture perception consistently engages temporal regions related to auditory and movement perception as well as fronto-parietal regions associated with action understanding. The results of this study suggest that brain regions involved in multisensory processing and action understanding constitute the general core of co-speech gesture perception.
Item Type: Article
Date Type: Publication
Status: Published
Schools: Psychology
Publisher: Elsevier
ISSN: 0911-6044
Date of Acceptance: 7 April 2014
Last Modified: 18 Feb 2019 14:16
URI: https://orca.cardiff.ac.uk/id/eprint/95290
Citation Data
Cited 17 times in Scopus.