Speech-accompanying gestures are not processed by the language-processing mechanisms

Publication date: Available online 2 July 2019
Source: Neuropsychologia
Author(s): Olessia Jouravlev, David Zheng, Zuzanna Balewski, Alvince Le Arnz Pongos, Zena Levan, Susan Goldin-Meadow, Evelina Fedorenko

Abstract
Speech-accompanying gestures constitute one information channel during communication. Some have argued that processing gestures engages the brain regions that support language comprehension. However, the studies cited as evidence for shared mechanisms suffer from one or more of the following limitations: they a) did not directly compare activations for gesture and language processing in the same study and relied on fallacious reverse inference (Poldrack, 2006) for interpretation, b) relied on traditional group analyses, which are bound to overestimate overlap (e.g., Nieto-Castañón & Fedorenko, 2012), c) failed to directly compare the magnitudes of response (e.g., Chen et al., 2017), and d) focused on gestures that may have activated the corresponding linguistic representations (e.g., "emblems"). To circumvent these limitations, we used fMRI to examine responses to gesture processing in language regions defined functionally in individual participants (e.g., Fedorenko et al., 2010), directly comparing effect sizes and covering a broad range of spontaneously generated co-speech gestures. Whenever speech was present, language regions responded robustly (and to a similar degree regardless of whether the video contained gestures or ...