Poet develops ‘seeing machine’

An MIT poet has developed a small, relatively inexpensive “seeing machine” that can allow people who are blind or, like her, visually challenged to access the Internet, view the face of a friend, “previsit” unfamiliar buildings and more.

The machine recently received positive feedback from 10 visually challenged people, with a range of causes for their vision loss, who tested it in a pilot clinical trial. The work was reported earlier this year in Optometry, the Journal of the American Optometric Association.

The work is led by Elizabeth Goldring, a senior fellow at MIT’s Center for Advanced Visual Studies. She developed the machine over the last 10 years, in collaboration with more than 30 MIT students and some of her personal eye doctors. The new device costs about $4,000, low compared to the $100,000 price tag of its inspiration, a machine Goldring discovered through her eye doctor.

Goldring’s adventures at the intersection of art and high technology began with a visit to her doctor, Lloyd Aiello, head of the Beetham Eye Institute of the Joslin Diabetes Center. At the time, Goldring was blind. (Surgeries have since restored vision in one eye.)

To better examine her eyes, Aiello asked her to go to the Schepens Eye Research Institute at Harvard, where technicians peered into her eyes with a diagnostic device known as a scanning laser ophthalmoscope, or SLO. With the machine they projected a simple image directly onto the retina of one eye, past the hemorrhages within the eye that contributed to her blindness. The idea was to determine whether she had any healthy retina left.

It turns out that she did, and was able to see the image — a stick figure of a turtle. But the turtle wasn’t very interesting, Goldring said. So she asked if they could write the word “sun” and transmit that through the SLO. “And I could see it!” she said. “That was the first time in several months that I’d seen a word, and for a poet that’s an incredible feeling.”

She went on to use the device for many other visual experiences. For example, she developed a “visual language” of short words that incorporate graphics and symbols to convey their meaning and make them easier to see and read.

But although the SLO held promise as more than a diagnostic device, it had serious drawbacks. In addition to its prohibitive cost, the SLO is large and bulky. Goldring resolved to develop a more practical machine for the broader blind public.

She did so by collaborating over the past several years with Rob Webb, the machine’s inventor and a senior scientist at the Schepens Eye Research Institute; Aiello; Dr. Jerry Cavallerano, an optometrist at Joslin; William Mitchell, former dean of MIT’s School of Architecture and Planning and now a professor in the Program in Media Arts and Sciences; the late Steve Benton, an acclaimed optical physicist and MIT professor; and former MIT affiliate James Cain.

She has also worked with dozens of MIT graduate students and undergraduates, including Sylvia Gonzalez (S.B. 2003) and Shima Rayej (S.B. 2004), who helped design and construct the seeing machine.

“We essentially made the new machine from scratch,” Goldring said. While still allowing the projection of images, video and more onto a person’s retina, the new desktop device costs much less than its predecessor, in part because it omits the SLO’s diagnostic feedback. It also replaces the SLO’s laser with light-emitting diodes, a much cheaper source of high-intensity light. Like its inspiration, the seeing machine is designed to be viewed with one eye.

The pilot clinical trial of the seeing machine involved visually impaired people recruited from the Beetham Eye Institute. All participants had a visual acuity of 20/70 or worse in the better-seeing eye; a person with 20/70 vision can see nothing smaller than the third line from the top of most eye charts. Most participants, however, were legally blind, meaning they could see nothing smaller than the “big E” at the top of a standard eye chart.

With her weak eye, Goldring can distinguish between light and dark and she can see hand movement, although not individual fingers. She cannot recognize faces or read.

Subjects “had a wide range of cause for vision loss, including diabetic retinopathy, macular degeneration (the fastest growing cause of blindness), and visual field loss,” said Cavallerano, a coauthor of the paper and another of Goldring’s doctors.

Participants used the machine to view 10 examples of Goldring’s visual language. A majority — six — interpreted all 10 “word-images” correctly. “They responded really well to the visual language,” Goldring said. “One woman told me she would love to see recipes written that way.”

They also used the machine to navigate through a virtual environment, raising the potential for “previewing” unfamiliar buildings a person wants to visit.

Goldring explained that visually challenged people are often terrified of going to new places. “There’s a fear of missing simple visual cues, steps and not being able to decipher elevator buttons.” (She noted that fewer than 10 percent of the blind read Braille.) Further, bystanders who aim to help — “there are five steps there; it’s the third door on the left” — are often wrong, especially people with good vision, Goldring said. “If you are visually challenged, if you see something once using the machine, you remember.”

Participants explored the virtual environment — which represented the inside of an MIT building — via a joystick that allowed them to move forward, backward and sideways.

All of the participants reported that the machine “may have the potential to assist their mobility in unfamiliar environments,” according to the Optometry article. Concluded Goldring: “A couple of them said they’d tried every seeing aid available (magnifying devices, etc.), and this was by far the best, even in this rough, rough shape.”

Goldring and colleagues are now working toward a large-scale clinical trial of a color seeing machine (the device tested in the pilot trial was black and white). With the color version, participants can explore a museum gallery containing some of Goldring’s art. When a person gets close enough to a piece, the work is explained in Goldring’s voice.

This work was supported by NASA and by MIT’s School of Architecture and Planning, Center for Advanced Visual Studies, Undergraduate Research Opportunities Program and Council for the Arts.

From MIT