AccessComputing Co-PI and Partners Win 2019 SIGACCESS ASSETS Paper Impact Award

Shaun K. Kane, Jacob O. Wobbrock, and Jeffrey P. Bigham

Every other year, the ACM Special Interest Group on Accessible Computing (SIGACCESS) gives a prestigious lasting impact award to a past paper from the ASSETS conference. The chosen paper must be at least 10 years old and have made a demonstrable impact on research or practice in the field of accessible computing. This year, at ASSETS 2019, the award was given to a paper I authored with two AccessComputing partners: Shaun K. Kane of the University of Colorado Boulder and Jeffrey P. Bigham of Carnegie Mellon University. Our 2008 paper was entitled “Slide Rule: Making Mobile Touch Screens Accessible to Blind People Using Multi-Touch Interaction Techniques.” Kane and Bigham were graduate students at the University of Washington when they did their work on Slide Rule.

The Slide Rule project was the first work to tackle the challenge of making touch screen devices accessible to blind people. Prior to 2007, I had conducted research on the accessibility of mobile devices, including mobile phones. Such devices commonly employed tactile features like raised buttons, dials, and knobs, which meant they could be operated, at least to some extent, by touch and hearing. But in 2007, with the advent of the Apple iPhone, those tactile landmarks went away. I was concerned that without physical landmarks on phones, blind people, or anyone who cannot see their device, would have no way to operate them. My then-Ph.D. student Shaun Kane and I took on the challenge of creating the first finger-driven screen reader, which Kane built to intelligently read aloud the text wherever a finger touches the screen. If the finger moves quickly, the read-out is abbreviated; if the finger dwells on a location, the text is read in full. We also created a method for selecting targets on the screen without ever lifting the reading finger: upon hearing a target read aloud, such as a button label or hyperlink text, the user taps a second finger anywhere on the screen to trigger the target that was just read. This technique has come to be known as the “split tap” gesture in today’s smartphone products. We also created gesture techniques for navigating lists and hierarchies and for transitioning between apps.
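To make the reading and split-tap behavior concrete, here is a minimal sketch in Python of the logic described above. It is illustrative only, not the actual Slide Rule implementation: the class and method names (FingerDrivenReader, on_finger_move, on_second_finger_tap), the speed and dwell thresholds, and the rule for abbreviating the read-out are all assumptions made for the example.

    # A minimal, hypothetical sketch of the reading logic described above
    # (not the actual Slide Rule implementation).
    from dataclasses import dataclass
    import math

    @dataclass
    class Target:
        label: str   # e.g., a button label or hyperlink text
        x: float
        y: float
        w: float
        h: float

        def contains(self, x: float, y: float) -> bool:
            return self.x <= x <= self.x + self.w and self.y <= y <= self.y + self.h

    class FingerDrivenReader:
        """Speaks whatever the primary finger touches; a second-finger tap
        ("split tap") activates the target that was just read aloud."""

        FAST_SPEED = 300.0  # px/sec above which the read-out is abbreviated (assumed value)
        DWELL_TIME = 0.8    # seconds of stillness before the full text is re-read (assumed value)

        def __init__(self, targets):
            self.targets = targets
            self.last_read = None   # most recently announced target
            self._prev = None       # (x, y, t) of the previous move event

        def _target_at(self, x, y):
            return next((t for t in self.targets if t.contains(x, y)), None)

        def on_finger_move(self, x, y, t):
            """Called as the primary (reading) finger slides across the screen."""
            speed = 0.0
            if self._prev is not None:
                px, py, pt = self._prev
                speed = math.hypot(x - px, y - py) / max(t - pt, 1e-6)
            self._prev = (x, y, t)

            target = self._target_at(x, y)
            if target is not None and target is not self.last_read:
                if speed > self.FAST_SPEED:
                    self.speak(target.label.split()[0])  # abbreviated read-out for a fast finger
                else:
                    self.speak(target.label)             # full read-out for a slow finger
                self.last_read = target

        def on_finger_dwell(self, x, y):
            """Called when the reading finger stays put for DWELL_TIME seconds."""
            target = self._target_at(x, y)
            if target is not None:
                self.speak(target.label)  # dwelling always yields the full text
                self.last_read = target

        def on_second_finger_tap(self):
            """The split tap: a second finger taps anywhere on the screen, triggering
            the target just read aloud, without lifting the reading finger."""
            if self.last_read is not None:
                print(f"Activated: {self.last_read.label}")

        def speak(self, text):
            print(f"Speaking: {text}")

    # Example: slide onto a button, then split-tap to activate it.
    reader = FingerDrivenReader([Target("Send message", 0, 0, 100, 40)])
    reader.on_finger_move(10, 10, t=0.0)  # finger lands on the button; its label is spoken
    reader.on_second_finger_tap()         # second finger taps anywhere; the button activates

The key design point the sketch tries to capture is that the reading finger never has to lift or aim precisely: exploration and activation are separated, with the second finger's tap acting only on whatever was last spoken.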

The success of Slide Rule was initially far from clear. “I talked to many other researchers about this idea,” said Kane. “Nobody thought it would work for users. They all told me it was bound to fail.” To compare Slide Rule to a more conventional means of accessing a touch screen, I recruited Jeffrey Bigham, then a Ph.D. student in computer science at the University of Washington and an expert in screen reader technology, to create a conventional screen reader as a comparison condition. “I thought for sure the conventional screen reader would win in our studies,” Bigham said. We recruited blind participants to use both systems and perform tasks such as finding information, navigating among applications, and selecting targets. Slide Rule came out the clear winner, and participants generally liked it as well. We were delighted and surprised by the result.