Visual speech input drives auditory cortex

We kind of already knew this. For example, early imaging work by Calvert and colleagues showed that portions of auditory cortex activate during lip reading, and electromagnetic recordings by van Wassenhove et al. showed that adding visual speech information to an auditory speech signal speeds up the N1 component of the cortical response to speech sounds. But there was still some wiggle room in explaining these effects. Yes, lip reading activates auditory cortex, but maybe this is just auditory imagery, and with group-averaged activations, how do you know where you are exactly? And yes, the N1 is an early component, but it likely reflects activity in a range of auditory regions, perhaps even including the STS, so how early is it in terms of the cortical processing hierarchy?

So we (Okada et al.) decided to try to nail this issue down with the following fMRI experiment. We localized auditory cortex in two different ways in individual subjects: with an AM-noise localizer scan and with an anatomically defined mask covering Heschl's gyrus. We then measured activity in these ROIs (left and right hemispheres) while subjects listened to auditory speech alone and to auditory-plus-visual speech. We found that adding the visual speech signal significantly up-regulated activity in auditory cortex compared to auditory speech alone. It's confirmed then: visual speech information drives auditory processing at the first stages of the cortical process...
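The core comparison here is simple enough to sketch. Below is a minimal, purely illustrative version of the ROI contrast: for each subject, take the mean response in the auditory-cortex ROI under auditory-only (A) and audiovisual (AV) speech, then run a paired t-test across subjects. The numbers are made-up placeholders, not the study's data, and the analysis choice (a paired t-test on per-subject ROI means) is an assumption about how such a contrast is typically done, not a description of the paper's exact pipeline.

```python
# Hypothetical sketch of an ROI contrast: per-subject mean activity in an
# auditory-cortex ROI under auditory-only (A) vs. audiovisual (AV) speech,
# compared with a paired t-test. All values are invented placeholders.
import numpy as np
from scipy import stats

# Placeholder per-subject mean ROI responses (arbitrary units); in the real
# study these would come from localizer-defined ROIs in individual subjects.
a_only = np.array([1.10, 0.95, 1.20, 1.05, 1.15, 0.90, 1.00, 1.08])
audiovisual = np.array([1.35, 1.10, 1.42, 1.20, 1.33, 1.05, 1.18, 1.25])

t_stat, p_val = stats.ttest_rel(audiovisual, a_only)
print(f"mean A  = {a_only.mean():.3f}")
print(f"mean AV = {audiovisual.mean():.3f}")
print(f"paired t = {t_stat:.2f}, p = {p_val:.4g}")
```

With these placeholder numbers the AV condition is consistently higher in every subject, so the paired test comes out strongly positive, which is the shape of the result being described: adding visual speech up-regulates the auditory ROI relative to auditory speech alone.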
Source: Talking Brains