December 16, 2025
Drivers struggle to multitask when using dashboard touch screens, study finds
Once the domain of buttons and knobs, car dashboards are increasingly home to large touch screens. While that makes following a mapping app easier, it also means drivers can’t feel their way to a control; they have to look. But how does that visual component affect driving?
New research from the University of Washington and Toyota Research Institute, or TRI, explores how drivers juggle driving, touch screen use and mental distraction. In the study, participants drove in a vehicle simulator, interacted with a touch screen and completed memory tests that mimic the mental effort demanded by traffic conditions and other distractions. The team found that when people multitasked, both their driving and their touch screen use suffered: the car drifted more in its lane while people used the touch screen, and their speed and accuracy on the screen declined while driving. Touch screen performance declined still further when the memory task was added.
These results could help auto manufacturers design safer, more responsive touch screens and in-car interfaces.
The team presented its research Sept. 30 at the ACM Symposium on User Interface Software and Technology in Busan, South Korea.
“We all know it’s dangerous to use your phone while driving,” said co-senior author James Fogarty, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “But what about the car’s touch screen? We wanted to understand that interaction so we can design interfaces specifically for drivers.”
As the study’s 16 participants drove the simulator, sensors tracked their gaze, finger movements, pupil diameter and electrodermal activity. The last two are common ways to measure mental effort, or “cognitive load.” For instance, pupils tend to dilate when people are concentrating.
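For readers curious how a signal like that becomes a number, here is a minimal sketch of one common pupil-based metric, task-evoked dilation relative to a resting baseline. The function name and sample values are illustrative only; the article does not describe the study’s actual analysis at this level of detail.

```python
def pupil_load_index(task_samples_mm, baseline_mm):
    """Hypothetical cognitive-load proxy: mean pupil diameter during a
    task, relative to a resting baseline. Pupils tend to dilate under
    mental effort, so larger values suggest higher load."""
    mean_task = sum(task_samples_mm) / len(task_samples_mm)
    return (mean_task - baseline_mm) / baseline_mm

# e.g. a pupil averaging 4.6 mm during a task vs. a 4.0 mm resting baseline
print(f"{pupil_load_index([4.5, 4.7, 4.6], 4.0):.0%} dilation over baseline")
```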
While driving, participants had to touch specific targets on a 12-inch touch screen, similar to how they would interact with apps and widgets. They did this while completing three levels of an “N-back task,” a memory test in which participants hear a series of digits, 2.5 seconds apart, and must repeat the digit they heard a set number of items earlier. Higher levels require reaching further back in the sequence, demanding more working memory.
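As an illustration of how such a task works, here is a minimal sketch of scoring one level of an auditory N-back test. The digit range, trial count and the exact values of n used in the study are assumptions here (only the 2.5-second pacing comes from the article), and the random responses merely stand in for a real participant.

```python
import random

def run_n_back(n, num_trials=20):
    """Score one level of an auditory N-back task.

    A digit is presented every 2.5 seconds; the correct response at each
    step is the digit heard n items earlier. A random guesser stands in
    for a real participant here.
    """
    stream = [random.randint(0, 9) for _ in range(num_trials)]
    correct = scored = 0
    for i in range(n, num_trials):
        expected = stream[i - n]         # the digit n steps back
        response = random.randint(0, 9)  # stand-in for the spoken reply
        scored += 1
        correct += (response == expected)
    return correct / scored

# Higher n means holding more digits in working memory at once.
for level in (1, 2, 3):
    print(f"{level}-back accuracy at chance: {run_n_back(level):.2f}")
```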
The participants’ performance changed significantly under different conditions:
- When interacting with the touch screen, participants drifted side to side in their lane 42% more often; increasing cognitive load did not change this measure.
- Touch screen accuracy and speed decreased 58% when participants were driving, and dropped a further 17% under high cognitive load.
- Each glance at the touch screen was 26.3% shorter under high cognitive load.
- A “hand-before-eye” phenomenon, in which drivers reached for a control before looking at it, increased from 63% to 71% as memory tasks were introduced.
The team also found that increasing the size of the target areas participants were trying to touch did not improve their performance.
“If people struggle with accuracy on a screen, usually you want to make bigger buttons,” said Xiyuan Alan Shen, a UW doctoral student in the Allen School. “But in this case, since people move their hand to the screen before touching, the thing that takes time is the visual search.”
Based on these findings, the researchers suggest future in-car touch screen systems might use simple sensors, such as eye tracking or touch sensors on the steering wheel, to monitor drivers’ attention and cognitive load. Using those readings, the car’s system could adjust the touch screen’s interface to make important controls more prominent and safer to access.
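As a rough illustration of that idea, here is a minimal sketch of adaptive-interface logic; the sensor signals, threshold and UI adjustments are all hypothetical, not the researchers’ design. Consistent with the finding that visual search, not button size, is the bottleneck, the sketch reduces on-screen clutter under high load rather than simply enlarging targets.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_on_road: bool      # hypothetical eye-tracker signal
    cognitive_load: float   # 0.0-1.0, e.g. inferred from pupil diameter
    hands_on_wheel: bool    # hypothetical steering-wheel touch sensor

HIGH_LOAD = 0.7  # assumed cutoff for "high" cognitive load

def adapt_interface(state: DriverState) -> dict:
    """Return hypothetical UI adjustments for the dashboard screen.

    When spare attention is low, hide low-priority widgets (shortening
    visual search) and highlight the controls drivers need most.
    """
    ui = {
        "show_secondary_widgets": True,
        "highlight_primary_controls": False,
        "lock_text_entry": False,
    }
    if state.cognitive_load > HIGH_LOAD or not state.eyes_on_road:
        ui["show_secondary_widgets"] = False     # fewer items to search
        ui["highlight_primary_controls"] = True  # make key controls salient
    if not state.hands_on_wheel:
        ui["lock_text_entry"] = True             # defer demanding input
    return ui

# A distracted moment: eyes off the road and high measured load.
print(adapt_interface(DriverState(eyes_on_road=False,
                                  cognitive_load=0.85,
                                  hands_on_wheel=True)))
```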
“Touch screens are widespread today in automobile dashboards, so it is vital to understand how interacting with touch screens affects drivers and driving,” said co-senior author Jacob O. Wobbrock, a UW professor in the Information School. “Our research is some of the first that scientifically examines this issue, suggesting ways for making these interfaces safer and more effective.”
Seokhyun Hwang, a UW doctoral student in the Information School, is co-lead author. Other co-authors include Alexandre L.S. Filipowicz, Andrew Best, Jean M. Costa and Scott Carter of TRI. This research was funded in part by TRI.
For more information, contact Wobbrock at wobbrock@uw.edu and Fogarty at jfogarty@cs.washington.edu.
Tag(s): College of Engineering • Information School • Jacob Wobbrock • James Fogarty • Paul G. Allen School of Computer Science & Engineering • Xiyuan Alan Shen