New research from Northeastern professor of psychology Iris Berent and her colleagues indicates that language and motor systems are intricately linked—though not in the way that has been widely believed.
Spoken languages express words as sound patterns, some of which are preferred to others. For instance, the sound pattern “blog” is preferred to “lbog” in English and in many other languages. The researchers wanted to know what accounts for such preferences: specifically, whether they reflect abstract rules of language in the brain, or whether, upon hearing speech, people simulate how those sounds are produced by the speech motor system.
Their findings support previous research linking people’s knowledge of language to the motor system; however, that connection is different from what has previously been assumed. The motor system does not drive linguistic preference directly, they found. Rather, abstract rules of language guide linguistic preference, and those rules in turn can trigger motor action. In other words, motor action is a consequence of linguistic preference, not its cause.
Sound patterns like “blog” are preferred over ones like “lbog” not because they are easier to produce, Berent said, but because they conform to linguistic rules; as a consequence, they tend to activate the motor system.
What’s more, Berent said these findings could have implications for the study of language-related disorders that are linked to the motor system. One of those areas is dyslexia, which Berent has been studying for years.
“This has huge theoretical implications,” said Berent, a cognitive scientist whose research examines the nature of linguistic competence. “The idea that linguistic knowledge is fully embodied in motor action is a hot topic in neuroscience right now. Our study shows that motor action is still very important in language processing, but we show a new twist on the mind-body connection.”
The research was published Monday afternoon in the journal Proceedings of the National Academy of Sciences. Among Berent’s collaborators was Alvaro Pascual-Leone, an internationally renowned neurologist at Beth Israel Deaconess Medical Center in Boston and Harvard Medical School, whose expertise in transcranial magnetic stimulation, or TMS, played a key role in the research. Xu Zhao, PhD’15, a doctoral student in Northeastern’s Department of Psychology, and other researchers affiliated with Beth Israel Deaconess Medical Center, Harvard Medical School, Brigham and Women’s Hospital, and the University of Oxford co-authored the paper.
Albert Galaburda, a co-author on the paper and a preeminent neurologist at BIDMC, said, “This study helps to solve a longstanding debate in the literature: What part of speech depends on experience and what part depends on relatively experience-independent grammatical rules, or some kind of logic system? Since my primary interest is in language-based learning disorders, particularly dyslexia, this question can be transformed to ask whether dyslexics have a primary disorder of grammar, or a primary disorder of the motor system or the poor perception of speech reaching their ears when babies.”
The researchers’ findings are based on a study gauging the sensitivity of English-speaking adults to syllable structure. Across languages, syllables like “blif” are more common than ones like “lbif,” and past research from Berent’s lab found that syllables like “blif” are also easier to process, suggesting that they are preferred. The researchers sought to pin down the reason for this preference: Do ill-formed syllables like “lbif” violate abstract rules, or are they simply difficult to process because they are hard to produce?
To examine this question, the researchers used TMS, a noninvasive technique that induces focal cortical currents via electromagnetic induction to temporarily inhibit specific brain regions. The goal was to find out whether disrupting participants’ lip motor regions with TMS would eliminate the preference for “blif.”
In the experiment, participants were presented with an auditory stimulus, either a monosyllable or a disyllable (e.g., “blif” or “belif”), and asked to indicate whether it contained one syllable or two. Two hundred milliseconds before the sound, TMS pulses were administered to temporarily disrupt the lip motor region. The critical comparison was between well-formed syllables (e.g., “blif”) and ill-formed ones (e.g., “lbif”): Would disrupting the motor system eliminate the processing disadvantage of “lbif”? If people dislike “lbif” because the pattern is difficult to articulate, then such syllables should be more susceptible to TMS, and the dislike for “lbif” should be weakened once the pulses are delivered.
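The trial structure described above can be summarized in a short sketch. This is a hypothetical reconstruction in Python based only on the details reported here (the 200 ms TMS lead and the one- versus two-syllable judgment); the names, data structures, and everything else are illustrative assumptions, not the authors’ experimental code.

```python
# Hypothetical sketch of one trial's timeline in the TMS experiment.
# Only the 200 ms TMS lead and the syllable-count task come from the
# article; all names and structure are assumed for illustration.

from dataclasses import dataclass

TMS_LEAD_MS = 200  # pulses delivered 200 ms before stimulus onset (reported)

@dataclass
class Trial:
    stimulus: str      # e.g. "blif" (one syllable) or "belif" (two)
    well_formed: bool  # True for onsets like "bl", False for "lb"
    tms: bool          # whether TMS pulses are delivered on this trial

def run_trial(trial: Trial) -> list[str]:
    """Return the ordered events of a single trial."""
    events = []
    t = 0
    if trial.tms:
        events.append(f"t={t} ms: TMS pulses to the lip motor region")
        t += TMS_LEAD_MS
    events.append(f"t={t} ms: play auditory stimulus '{trial.stimulus}'")
    events.append("response: participant reports one or two syllables")
    return events

if __name__ == "__main__":
    for trial in (Trial("blif", True, True), Trial("lbif", False, True)):
        print("\n".join(run_trial(trial)), end="\n\n")
```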
They found that TMS pulses did impair participants’ ability to accurately judge the number of syllables. But the results flew in the face of the motor embodiment hypothesis: ill-formed syllables like “lbif” were the least likely to be impaired by TMS, and a subsequent functional MRI experiment found that these syllables were also the least likely to engage the lip motor area of the brain.
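To make the logic of that comparison concrete, here is a minimal sketch assuming a simple accuracy-based analysis; the function names and the accuracy measure are illustrative assumptions, not the authors’ actual analysis.

```python
# Illustrative sketch of the inference, not the authors' analysis code.
# The "TMS effect" for a condition is how much response accuracy drops
# when the lip motor region is disrupted.

def tms_effect(acc_without_tms: float, acc_with_tms: float) -> float:
    """Accuracy cost of disrupting the lip motor region."""
    return acc_without_tms - acc_with_tms

def interpret(effect_well_formed: float, effect_ill_formed: float) -> str:
    # Motor embodiment prediction: "lbif" is dispreferred because it is
    # hard to articulate, so motor disruption should hurt it MORE.
    if effect_ill_formed > effect_well_formed:
        return "consistent with the motor embodiment account"
    # Observed pattern: ill-formed syllables were LEAST affected by TMS,
    # pointing instead to abstract linguistic rules.
    return "consistent with abstract linguistic rules"
```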
The results show that speech perception automatically engages the articulatory motor system, but linguistic preferences persist even when the language motor system is disrupted. These findings suggest that, despite their intimate links, the language and motor systems are distinct.
“Language is designed to optimize motor action, but its knowledge consists of principles that are disembodied and potentially abstract,” the researchers concluded.

Source: Northeastern University