You are hungry. You get a can from the pantry or some leftovers from the fridge, warm up the food in the microwave and sit down to eat a quick and easy meal.
While that sounds straightforward to many of us, for individuals who are visually impaired or blind, everyday tasks such as finding a specific can of food or using household appliances like microwaves can pose significant challenges.
“The world is not really designed for people with impaired vision,” says Waisman researcher Ender Tekin, “and there hasn’t been enough emphasis on inclusivity in terms of designing appliances or how we navigate through our lives, which makes it even harder for people with vision loss to participate fully in society.”
Tekin is working to harness existing technology—such as mobile phones, tablets and webcams—to access environmental information that is mostly visual and find ways to provide that information to individuals with limited or no vision. He joined the Waisman Center in August 2015 after spending several years as a researcher at the Smith-Kettlewell Eye Research Institute in San Francisco, where he worked on applications—or apps—that could use the camera of a smartphone to tell what someone might be holding.
“For example, if individuals with blindness or impaired vision are trying to identify different cans in their pantry, they should be able to grab a can and, without opening it, use their smartphone to know what they are holding,” says Tekin.
The underlying concept was relatively simple: “Most products have a Universal Product Code printed on them and there are databases that match that code to the manufacturer and product,” says Tekin. All the app would have to do is interpret this code, commonly known as UPC, and tell the user what the product was.
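That underlying concept can be sketched in a few lines of Python. The product table below is hypothetical (real apps query large commercial UPC databases), but the check-digit math follows the standard UPC-A rule, which lets an app reject a misread scan before looking anything up:

```python
def upc_check_digit(digits11: str) -> int:
    """Compute the UPC-A check digit from the first 11 digits."""
    odd = sum(int(d) for d in digits11[0::2])   # digits in positions 1, 3, ..., 11
    even = sum(int(d) for d in digits11[1::2])  # digits in positions 2, 4, ..., 10
    return (10 - (odd * 3 + even) % 10) % 10

def lookup_product(upc: str, database: dict) -> str:
    """Validate a scanned code, then return the matching product name if known."""
    if len(upc) != 12 or not upc.isdigit():
        return "not a UPC-A code"
    if upc_check_digit(upc[:11]) != int(upc[11]):
        return "invalid code (likely a misread scan)"
    return database.get(upc, "unknown product")

# Hypothetical one-entry database for illustration only.
products = {"036000291452": "facial tissues, 160 count"}
print(lookup_product("036000291452", products))
```

The check digit is why barcode apps can be confident about a match: a single misread digit almost always fails validation, so the app asks the user to rescan rather than announcing the wrong product.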
But there are significant hurdles to overcome. “None of the apps actually give directions on how to find a bar code; they just assume that the user is able to center the bar code in the field of view of the camera,” says Tekin, “but that’s very difficult, if not impossible, for a visually impaired person to do in a reasonable amount of time.”
Few researchers are working on how best to provide feedback to a blind user about orienting a camera. Tekin and his colleagues at Smith-Kettlewell are trying out different kinds of audio feedback—from spoken directions, such as “move left/right” or “move closer/farther,” to a series of beeps that tell the user whether he or she is getting closer to placing the camera accurately.
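The spoken-direction style of feedback can be illustrated with a small sketch. This is not Smith-Kettlewell's implementation; it simply assumes a barcode detector has reported the code's center position in the camera frame, and turns the offset from the frame center into a short spoken hint:

```python
def guidance(cx: int, cy: int, frame_w: int = 640, frame_h: int = 480,
             tolerance: int = 40) -> str:
    """Convert a detected barcode position into a spoken direction.

    (cx, cy) is the barcode center in pixels; tolerance is how far off-center
    the code may sit before we prompt the user to pan the camera toward it.
    """
    dx = cx - frame_w / 2   # negative: barcode is left of center
    dy = cy - frame_h / 2   # negative: barcode is above center
    hints = []
    if abs(dx) > tolerance:
        hints.append("move left" if dx < 0 else "move right")
    if abs(dy) > tolerance:
        hints.append("move up" if dy < 0 else "move down")
    return ", ".join(hints) or "hold steady"
```

A beep-based interface could reuse the same offset, shortening the interval between beeps as the code nears the center, much like a car's parking sensor.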
While at Smith-Kettlewell, Tekin and his colleague James Coughlan realized there is another major accessibility challenge for people with vision loss. As more and more appliances switch to LED touch-screen control panels, the lack of tactile operating features—knobs, dials or buttons—makes these appliances very difficult for people who are visually impaired or blind to use.
“The first thing we developed is an optical character recognition (OCR) system that works for LED displays on appliances,” says Tekin. OCR systems have been around for a while and have several uses—from allowing airport scanning machines to recognize passports to making digital copies of books and documents text-searchable.
But letters on paper and numbers on appliance LCD screens present different challenges. “There’s often a lot of glare and complex lighting conditions with LCD screens,” says Tekin, “and even the polarizing cameras in smartphones can sometimes cause problems with recognizing what’s being displayed on them.”
Creating an OCR system that works on LCD screens is just the first part of the puzzle. The original problem remains as well: how best to guide users with visual impairments so that their cameras aim accurately at the appliance display. As Tekin puts it, “LED displays usually have a flat surface, so for someone with impaired vision, it’s not even easy to tell where the display or the buttons are.”
So individuals with limited vision have to first identify where the display is on the appliance, then interact successfully with the appliance (such as finding the START button on a microwave and pushing it), and finally be able to receive information about the appliance as it’s operating (a microwave countdown, for example). Any device designed to aid them through this process has to be able to help at all three stages.
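Those three stages can be modeled as a small state machine, which is one plausible way an assistive app could organize itself (this is an illustrative sketch, not a description of Tekin's actual software; the event names are invented for the example):

```python
from enum import Enum, auto

class Stage(Enum):
    FIND_DISPLAY = auto()  # locate the control panel on the appliance
    OPERATE = auto()       # find and press the right button (e.g., START)
    MONITOR = auto()       # read out the display while the appliance runs

def next_stage(stage: Stage, event: str) -> Stage:
    """Advance through the three stages; unknown events leave the stage unchanged."""
    transitions = {
        (Stage.FIND_DISPLAY, "display_located"): Stage.OPERATE,
        (Stage.OPERATE, "button_pressed"): Stage.MONITOR,
        (Stage.MONITOR, "appliance_done"): Stage.FIND_DISPLAY,
    }
    return transitions.get((stage, event), stage)
```

Structuring the aid this way matters because the feedback the user needs differs at each stage: camera-aiming guidance while locating the display, tactile or spoken button directions while operating, and continuous read-aloud while monitoring.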
It’s also important to keep in mind that there’s no single user interface that works for everybody, according to Tekin. “The individuals we work with have different levels of vision and correspondingly different needs. Some of them are completely blind and they prefer that everything is read aloud; others have low vision and may prefer something that magnifies the display and increases the contrast so they can read the display,” he says.
Being at the Waisman Center allows Tekin to work at the intersection of technology and accessibility issues and collaborate with researchers who are experts in different, yet related, fields. For example, Tekin has started collaborating with Waisman researcher Ruth Litovsky on making visual and audio communications more accessible.
“Hearing aids focus on the auditory aspect alone,” says Tekin, “and usually that’s enough because if you have normal vision you can see the person you are talking to and your hearing can benefit from the aid.”
But a growing number of older adults are experiencing both vision and hearing loss, and these individuals often find it difficult to harness visual or auditory information as well as someone with normal vision. “I want to harness technology to take advantage of visual cues—simple ones like identifying the person talking while in a group setting or knowing when someone’s lips are moving—to make it easier for folks with hearing and sight problems to navigate their world,” says Tekin.
As technology advances by leaps and bounds, individuals with hearing or vision impairments often face unique challenges. But new devices can help them connect with friends and family, and new tools can assist them in navigating daily life. “There are a lot of emerging technologies and I want to use them to help make the world more inclusive,” says Tekin.