Researchers at The University of Texas at Dallas have received three grants from the National Institute on Deafness and Other Communication Disorders aimed at treating a variety of speech disorders.
The multidisciplinary effort will focus primarily on improving speech communication, predicting verbal deficits in patients with amyotrophic lateral sclerosis and improving diagnostic testing for speech disorders in children.
Dr. Thomas Campbell, the Ludwig A. Michael, MD, Callier Center Executive Director and Sara T. Martineau Professor, is an investigator on all three projects.
The projects will be conducted through UT Dallas’ Communication Technology Center (CTech), which fosters interdisciplinary collaboration and research and serves as an incubator for technology projects that focus on communication disorders.
“CTech is different from other programs because we develop the technologies to study communication and also because we combine the efforts of several specialties that could not accomplish these projects on their own,” Campbell said. “We are truly interdisciplinary in our approach to developing communication technology.”
CTech is made up of researchers from the School of Behavioral and Brain Sciences, the Erik Jonsson School of Engineering and Computer Science and the School of Arts and Humanities.
Campbell and his colleagues received a small-business grant to develop software that uses data from electromagnetic articulography to create a computer-generated representation of a person’s tongue movements. Electromagnetic articulography tracks the movements of small oral sensors within an electromagnetic field. The display would provide patients with a real-time view of how they are moving their tongues when they talk, which could help them improve their speech.
Dr. William Katz, co-investigator and professor at the Callier Center for Communication Disorders, said that healthy speakers know how to control their tongues to make the right sounds. But people with apraxia of speech can have trouble with this process. They typically know what they want to say but have difficulty making their muscles perform correctly, causing sounds to come out wrong.
“It’d be like trying to grab for a cup without being able to see your hand. All you know is you didn’t get it,” Katz said about speech training without visual feedback. “Our approach is to give the patient additional real-time information about tongue movement during speech. The goal is to improve their tongue positioning behavior through self-correction and practice.”
A second grant will fund research using the same electromagnetic articulography technology to predict the progression of ALS, also known as Lou Gehrig’s disease. ALS is a neurodegenerative disease that leads to paralysis and death.
As ALS progresses, the motor neurons enabling speech eventually die, robbing the patient of the ability to speak. Using electromagnetic articulography, researchers hope to improve their ability to reliably monitor changes in tongue movement in ALS patients. This information may allow future software to recognize intended words and create an avenue for continued communication.