Tactile input affects what we hear: UBC study

Humans use their whole bodies, not just their ears, to understand speech, according to University of British Columbia linguistics research.

It is well known that humans naturally process facial expression along with what is being heard to fully understand what is being communicated. The UBC study is the first to show we also naturally process tactile information to perceive sounds of speech.

Prof. Bryan Gick of UBC’s Dept. of Linguistics, along with PhD student Donald Derrick, found that air puffs directed at skin can bias perception of spoken syllables. “This study suggests we are much better at using tactile information than was previously thought,” says Gick, also a member of Haskins Laboratories, an affiliate of Yale University.

The study, published in Nature today, offers findings that may be applied to telecommunications, speech science and hearing aid technology.

English speakers use aspiration — the tiny bursts of breath accompanying speech sounds — to distinguish sounds such as “pa” and “ta” from unaspirated sounds such as “ba” and “da.” Study participants heard eight repetitions of these four syllables while inaudible air puffs — simulating aspiration — were directed at the back of the hand or the neck.

When the subjects — 66 men and women — were asked to distinguish the syllables, it was found that syllables heard simultaneously with air puffs were more likely to be perceived as aspirated, causing the subjects to mishear “ba” as the aspirated “pa” and “da” as the aspirated “ta.” The brain associated the air puffs felt on skin with aspirated syllables, interfering with perception of what was actually heard.

The researchers say it is unlikely that listeners normally feel a speaker’s aspiration on their skin. The phenomenon is more likely analogous to lip-reading, in which the brain’s auditory cortex activates when the eyes see lips move, signaling speech. From the brain’s point of view, you are “hearing” with your eyes.

“Our study shows we can do the same with our skin, ‘hearing’ a puff of air, regardless of whether it got to our brains through our ears or our skin,” says Gick.

Future research may include studies of how audio, visual and tactile information interact to form the basis of a new multi-sensory speech perception paradigm. Additional studies may examine how many kinds of speech sounds are affected by air flow, offering important information about how people interact with their physical environment.

The material in this press release comes from the originating research organization. Content may be edited for style and length.