Device captures ‘moving’ sound

A conversation snaps around you. A string trio plays in front of you as you turn your attention from cello to violin. A soprano walks across the stage, and you turn your head to follow her voice. Conventional sound recording cannot capture those experiences — but an invention by audio engineers at the University of California, Davis is changing that.

From UC Davis:
Sound With Space and Motion

Motion-tracked binaural sound (MTB) captures cues for direction, distance and movement and the subtleties of natural, ambient sound that other systems don’t. Developed by Ralph Algazi, Richard Duda and Dennis Thompson at the Interface Laboratory in the UC Davis Center for Image Processing and Integrated Computing (CIPIC), the patent-pending technique uses off-the-shelf equipment that won’t break the bank.

For the listener, it’s an eerie experience: invisible people talk, play music or sing around you as you turn your attention from one to another.

“Conventional audio playback doesn’t reflect how you hear in real life,” Algazi said. “Your body, the shape of your head and the room acoustics all affect how you hear.”

Conventional binaural recording uses microphones embedded in a dummy head to record sound, so that playback through headphones mimics the recording. That method does a fair job of reproducing sounds to the left and right, but frontal sounds tend to collapse to a point immediately in front of or behind the head. Furthermore, it doesn’t allow for movement, Duda said.

“The problem is that people do not keep still,” Duda said.

The new method records through multiple microphones (eight for voice, 16 for music) spaced around a head-sized ball or cylinder. The sound is played back through headphones with a small tracking device attached to the top to follow head movements. As you turn your head while listening, the system mixes sound from different microphones, reproducing what you would hear if you were in the room.
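The mixing step can be sketched in a few lines of code. The snippet below is a simplified illustration under assumed details, not the UC Davis implementation: a hypothetical mtb_mix function takes the tracks from microphones spaced around a head-sized cylinder plus the head yaw reported by the tracker, and cross-fades each ear between the two microphones nearest that ear’s current position.

```python
import numpy as np

def mtb_mix(mic_tracks: np.ndarray, mic_angles_deg: np.ndarray,
            head_yaw_deg: float, ear_offset_deg: float = 90.0) -> np.ndarray:
    """Simplified motion-tracked binaural mix (illustrative only).

    mic_tracks     -- shape (num_mics, num_samples), one channel per microphone
                      spaced around a head-sized cylinder
    mic_angles_deg -- azimuth of each microphone on the cylinder, in degrees
    head_yaw_deg   -- current head orientation from the tracker, in degrees
    ear_offset_deg -- angular offset of each ear from the facing direction

    Returns a (2, num_samples) stereo signal: for each ear, the two microphones
    nearest that ear's current azimuth are cross-faded by angular distance.
    """
    out = np.zeros((2, mic_tracks.shape[1]))
    for ear, sign in enumerate((-1.0, +1.0)):            # left ear, right ear
        ear_az = (head_yaw_deg + sign * ear_offset_deg) % 360.0
        # angular distance from the ear position to every microphone
        diff = np.abs((mic_angles_deg - ear_az + 180.0) % 360.0 - 180.0)
        nearest = np.argsort(diff)[:2]                    # two closest mics
        d0, d1 = diff[nearest]
        w0 = d1 / (d0 + d1) if (d0 + d1) > 0 else 1.0     # linear cross-fade
        out[ear] = w0 * mic_tracks[nearest[0]] + (1.0 - w0) * mic_tracks[nearest[1]]
    return out

# Example: 8 microphones every 45 degrees, listener facing 30 degrees off-axis
mics = np.random.randn(8, 48000) * 0.1                   # placeholder audio
angles = np.arange(8) * 45.0
stereo = mtb_mix(mics, angles, head_yaw_deg=30.0)
```

A plain cross-fade by angular distance is the simplest possible interpolation; the actual system presumably uses more careful blending so that the image stays stable as the head turns, but the principle is the same: the tracker’s orientation selects which microphones feed each ear.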

Listeners can move their heads to locate a sound source, turn to ”face” a person speaking and tell when sound sources are nearby or far away. In addition, MTB captures the ambient sounds of the location, so you recognize the echoes in a church or the confines of a conference room.

The engineers have made sample recordings with musicians from the UC Davis music department and visiting classical and bluegrass musicians.

“The system can capture the sound of instruments much more fully than a conventional single microphone. It captures changes in sound and the effects of a room in a way that is much closer to reality,” Algazi said.

“I think they have something really wonderful,” said Pablo Ortiz, chair of the Department of Music at UC Davis.

“The thing that’s interesting to me is the way that it records space,” Ortiz said.

William Beck, a composer of electronic music and lecturer in the music department, said that live recordings could be a major application.

“The ‘being there’ feel is something people would really like,” Beck said.

The technique could also be used for teleconferencing, computer games and virtual- or augmented-reality systems. The work, which is supported by grants from the National Science Foundation, was presented at the 116th Convention of the Audio Engineering Society held in May 2004 in Berlin, Germany.

The CIPIC Interface Laboratory is a multidisciplinary research center that studies human perception and its role in the interface between humans and machines.



