ScienceBlog.com

Auditory perception

AI headphones let wearer listen to a single person in a crowd, by looking at them just once

A University of Washington team has developed an artificial-intelligence system that lets a headphone wearer "enroll" a speaker by looking at them for three to five seconds. The headphones then play only the enrolled speaker's voice in real time, even as the listener moves through noisy environments and no longer faces the speaker. Pictured is a prototype of the headphone system: binaural microphones attached to off-the-shelf noise-canceling headphones.
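The UW system reportedly uses a learned neural model for enrollment and binaural extraction; none of that is reproduced here. As a rough, hypothetical sketch of the enroll-then-extract idea (the function names and the crude spectral-signature heuristic below are illustrative assumptions, not the team's method), one could stand in a frequency-domain soft mask for the learned speaker model:

```python
import numpy as np

def spectral_signature(audio, frame=256):
    """Average magnitude spectrum over frames -- a toy stand-in for a
    learned speaker embedding built during the 3-5 second 'enrollment'."""
    frames = audio[: len(audio) // frame * frame].reshape(-1, frame)
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)

def extract(mixture, signature, frame=256):
    """Re-weight each frame's spectrum toward the enrolled signature
    (a crude soft mask; real systems use neural target-speech extraction)."""
    frames = mixture[: len(mixture) // frame * frame].reshape(-1, frame)
    spec = np.fft.rfft(frames, axis=1)
    mask = signature / (signature.max() + 1e-9)  # 0..1 weight per frequency bin
    return np.fft.irfft(spec * mask, n=frame, axis=1).ravel()

# Toy demo: enrolled "speaker" is a 440 Hz tone, interferer is a 1300 Hz tone.
sr = 8000
t = np.arange(sr) / sr
target = np.sin(2 * np.pi * 440 * t)
noise = np.sin(2 * np.pi * 1300 * t)

sig = spectral_signature(target[:2048])  # brief "look at the speaker" clip
out = extract(target + noise, sig)       # interferer is strongly attenuated
```

In this toy version the mask passes the frequencies that dominated the enrollment clip and suppresses the rest, so the 440 Hz "speaker" survives the mixture while the 1300 Hz interferer is attenuated by roughly two orders of magnitude.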


© 2026 ScienceBlog.com