Scientists find brain responses to sentence structure differ for speaking and listening
MedicalXpress Breaking News and Events, Mar 09, 2024
How does the brain respond to sentence structure as we speak and listen? In a neuroimaging study published in PNAS, researchers from the Max Planck Institute for Psycholinguistics (MPI) and Radboud University in Nijmegen investigated sentence processing during spontaneous speech for the first time.
While we speak, brain activity increases early on in sentences, anticipating structure building. In contrast, during listening, brain activity increases at the end of phrases, reflecting the integration of sentence structure.
Both speaking and listening involve combining words in a sentence following grammatical rules. However, the precise timing of this 'syntactic processing' remains unclear. Sentence production is studied less often than sentence comprehension. Moreover, researchers usually study sentence production with complex tasks that are very different from speaking in natural situations.
"Syntactic processing allows us to combine words to create new meanings," says senior researcher Peter Hagoort, director of the Donders Institute for Brain, Cognition and Behavior. "We investigated brain responses to spontaneous speech, to better understand how the brain does it and how this process differs when we speak versus when we listen."
Watching TV in a scanner
The researchers decided to compare brain responses to syntactic processing during spontaneous speaking and listening. Native English speakers watched an episode of the BBC series 'Sherlock' in an MRI scanner. Next, they were asked to recall what happened in their own words. A separate group of participants then listened to a recording of one of these speakers recalling the episode ("So, they began with like a dream sequence of a shootout"). This enabled the team to compare brain activity during speaking and listening to the same sentences.
The team extracted the syntactic structure of each spoken sentence and modeled how many syntactic operations were needed at each word to build that structure, as illustrated in the sketch below. They then asked which brain areas were sensitive to these syntactic operations during speaking and listening.
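The logic behind such a word-by-word predictor can be illustrated with a minimal sketch. Assuming a standard bracketed constituency parse, one generic way to quantify structure building is to count, for each word, the nodes opened just before it (anticipatory, top-down building) and the nodes closed right after it (integrative, bottom-up building). The counting scheme and example parse here are illustrative assumptions, not necessarily the exact operations modeled in the paper.

import re

def node_counts(bracketed_parse):
    """For each word in a bracketed constituency parse, count the nodes
    opened just before it (anticipatory building) and the nodes closed
    right after it (integration at phrase ends)."""
    # Split the parse into '(', ')' and label/word tokens.
    tokens = re.findall(r"\(|\)|[^\s()]+", bracketed_parse)
    counts, opened_since_last_word = [], 0
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == "(":
            opened_since_last_word += 1
            i += 2               # skip the node label that follows '('
        elif tok == ")":
            i += 1
        else:                    # a terminal, i.e. a word
            closes, j = 0, i + 1
            while j < len(tokens) and tokens[j] == ")":
                closes += 1      # nodes completed once this word is heard
                j += 1
            counts.append((tok, opened_since_last_word, closes))
            opened_since_last_word = 0
            i += 1
    return counts

# Hypothetical, simplified parse of part of the recalled sentence.
parse = "(S (NP (PRP they)) (VP (VBD began) (PP (IN with) (NP (DT a) (NN shootout)))))"
for word, opened, closed in node_counts(parse):
    print(f"{word:10s} opened={opened} closed={closed}")

Running this prints, for example, that "they" opens three nodes while "shootout" closes five, showing how anticipatory operations cluster early in the sentence and integrative operations pile up at phrase ends. Per-word counts like these can then be used as regressors against the brain signal.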
Early or late activation
During speaking, brain areas associated with syntactic processing showed increased activation early on in the sentences. This indicates that while we speak, we build sentence structure incrementally, word by word, in anticipation of what comes next. In contrast, during listening, brain activity increased towards the end of phrases—groups of words that function as grammatical units. To understand a sentence, listeners in this study tended to adopt a 'wait-and-see' approach, integrating all the available information once a phrase was complete.
"This study brings us closer to understanding the similarities and differences between speaking and listening and how these everyday functions are implemented in the brain," says first author Laura Giglio.
"It is feasible to study spontaneous speech and much can be learned from it. Future research can take advantage of this study to better model brain responses to linguistic processing and to better describe the complex relationship between speaking and listening."