Making Mobile Apps for the Disabled

Armed with a cell phone, disabled people may soon find the real world a lot more accessible, thanks to some UW students. Last winter, 10 students in Professor Richard Ladner's mobile accessibility class used open-source software to create five mobile applications designed to help the blind and hearing-impaired better navigate the world. Among the applications were ezTasker, which reminds Alzheimer's patients and others with cognitive disabilities to perform daily tasks such as feeding a pet, and shows them how; MOCR, an optical character recognition application that lets blind users take a picture of a menu and have it read aloud to them; and BrailleLearn, which offers games that encourage Braille learning among blind children. All the applications run on Android smartphones, eliminating the need for expensive, limited proprietary medical hardware or software.

The best applications, Ladner says, are those that don't reinvent the wheel but make use of existing services. Such is the case with LocalEyes, which draws on Google Maps and GPS to help the visually impaired learn what is around them. With a phone in hand, a blind user might be able to pinpoint which shops are to the left, which restaurants are to the right and, eventually, how many blocks it is to the dry cleaner.

Because the apps run on cell phones, they integrate with, or run alongside, other everyday applications. Not only can an Alzheimer's patient be prompted to feed the cat, for example; he can also receive calls and e-mail from family members on the same device. What's more, most apps would be free.