Researchers unlock the ‘sound of learning’ by linking sensory and motor systems

Learning to talk also changes the way speech sounds are heard, according to a new study published in Proceedings of the National Academy of Sciences by scientists at Haskins Laboratories, a Yale-affiliated research laboratory. The findings could have a major impact on the treatment of speech disorders.

“We’ve found that learning is a two-way street; motor function affects sensory processing and vice-versa,” said David J. Ostry, a senior scientist at Haskins Laboratories and professor of psychology at McGill University. “Our results suggest that learning to talk makes it easier to understand the speech of others.”

As a child learns to talk, or an adult learns a new language, Ostry explained, growing mastery of oral fluency is matched by an increasing ability to distinguish different speech sounds. While these abilities might develop independently of one another, it is possible that learning to talk also changes the way we hear speech sounds.

Ostry and co-author Sazzad M. Nasir tested the notion that speech motor learning alters auditory perceptual processing by evaluating how speakers hear speech sounds following motor learning. They simulated speech learning by using a robotic device, which introduced a subtle change in the movement path of the jaw during speech.

To assess speech perception, participants listened, one word at a time, to items drawn from a computer-produced continuum between the words “had” and “head.” In the speech learning phase of the study, the robot caused the jaw to move in a slightly unusual fashion. Learning was measured by assessing the extent to which participants corrected for the unusual movement.
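The identification task described above can be illustrated with a short sketch. Listeners label each step of a “had”–“head” continuum, and the category boundary is the point where “head” responses cross 50%; a shift in that boundary after motor learning would indicate changed perception. All numbers below are hypothetical illustrations, not data from the study.

```python
def boundary(steps, p_head):
    """Return the continuum step at which the proportion of 'head'
    responses first crosses 0.5, found by linear interpolation
    between adjacent continuum steps."""
    for (s0, p0), (s1, p1) in zip(
        zip(steps, p_head), zip(steps[1:], p_head[1:])
    ):
        if p0 < 0.5 <= p1:
            return s0 + (0.5 - p0) * (s1 - s0) / (p1 - p0)
    return None

# Hypothetical identification curves on a 7-step had-head continuum,
# before and after motor learning (illustrative values only):
steps = list(range(1, 8))
before = [0.02, 0.05, 0.20, 0.55, 0.85, 0.95, 0.99]
after = [0.02, 0.04, 0.12, 0.35, 0.70, 0.92, 0.99]

# A positive shift means more of the continuum is heard as "had"
# after learning, i.e. the perceptual boundary has moved.
shift = boundary(steps, after) - boundary(steps, before)
print(f"perceptual boundary shift: {shift:.2f} continuum steps")
```

This is only a schematic of the perceptual-boundary idea; the actual analysis in the paper may differ (for example, fitting a full psychometric function rather than interpolating).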

“It’s like being handed a two-pound weight for the first time and being asked to make a movement: it’s uncomfortable at first, but after a while the movement becomes natural,” said Ostry. “In growing children, the nervous system has to adjust to moving vocal tract structures that are changing in size and weight in order to produce the same words. Participants in our study are learning to return the movement to normal in spite of these changes. Eventually our work could have an impact on deviations to speech caused by disorders such as stroke and Parkinson’s disease.”

“Our study showed that speech motor learning altered the perception of these speech sounds. After motor learning, the participants heard the words differently than those in the control group,” said Ostry. “One of the striking findings is that the more motor learning we observed, the more their speech perceptual function changed.”

Ostry said that future research will focus on the notion that sensory remediation may be a way to jumpstart the motor system.

The team previously found that the movement of facial muscles around the mouth plays an important role not only in the way the sounds of speech are made, but also in the way they are heard.

Haskins Laboratories was founded in 1935 by the late Dr. Caryl P. Haskins. This independent research institute has been in New Haven, Connecticut since 1970 when it formalized affiliations with Yale University and the University of Connecticut. The Laboratories’ primary research focus is on the science of the spoken and written word.

Citation: PNAS Early Edition, November 2, 2009. doi: 10.1073/pnas.0907032106
