Dynamic EEG analysis during language comprehension reveals interactive cascades between perceptual processing and sentential expectations.

Brain Lang. 2020 Oct 18;211:104875

Authors: Sarrett ME, McMurray B, Kapnoula EC

Abstract
Understanding spoken language requires analysis of the rapidly unfolding speech signal at multiple levels: acoustic, phonological, and semantic. However, there is not yet a comprehensive picture of how these levels relate. We recorded electroencephalography (EEG) while listeners (N = 31) heard sentences in which we manipulated acoustic ambiguity (e.g., a bees/peas continuum) and sentential expectations (e.g., "Honey is made by bees"). EEG was analyzed with a mixed-effects model over time to quantify how language-processing cascades proceed on a millisecond-by-millisecond basis. Our results indicate that: (1) perceptual processing of, and memory for, fine-grained acoustics is preserved in brain activity for up to 900 ms; (2) contextual analysis begins early and is graded with respect to the acoustic signal; and (3) top-down predictions influence perceptual processing in some cases; however, these predictions are available simultaneously with the veridical signal. These mechanistic insights provide a basis for a better understanding of the cortical language network.

PMID: 33086178 [PubMed - as supplied by publisher]
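To make the time-resolved analysis concrete, the sketch below shows one way to fit a mixed-effects model to single-trial EEG amplitude at each time sample. This is not the authors' actual pipeline; the data frame, column names, and coding scheme (trials, vot_step, context) are hypothetical stand-ins for the kind of design the abstract describes.

# Minimal sketch (hypothetical data): a per-sample linear mixed-effects
# regression of single-trial EEG amplitude on the acoustic continuum step
# and sentential context, with subject as the random-effects grouping factor.
# Assumes a long-format DataFrame `trials` with columns:
#   subject   - participant ID
#   time_ms   - sample latency relative to target-word onset
#   amplitude - voltage averaged over a channel cluster at that sample
#   vot_step  - position on the acoustic (e.g., bees/peas) continuum
#   context   - sentential expectation, coded e.g. -0.5 / +0.5
import pandas as pd
import statsmodels.formula.api as smf

def fit_timecourse(trials: pd.DataFrame) -> pd.DataFrame:
    """Fit one mixed model per time sample; return fixed-effect estimates."""
    rows = []
    for t, chunk in trials.groupby("time_ms"):
        model = smf.mixedlm(
            "amplitude ~ vot_step * context",   # fixed effects
            data=chunk,
            groups=chunk["subject"],            # random intercept per subject
        )
        fit = model.fit(method="lbfgs")
        rows.append({
            "time_ms": t,
            "b_vot_step": fit.params.get("vot_step"),
            "b_context": fit.params.get("context"),
            "b_interaction": fit.params.get("vot_step:context"),
        })
    return pd.DataFrame(rows)

# Usage: timecourse = fit_timecourse(trials)
# Plotting b_vot_step against time_ms would show how long fine-grained
# acoustic detail remains encoded in the evoked response; b_context and
# the interaction track when sentential expectations begin to matter.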