Between humanity and technology

What makes us human? The UW Center for Neurotechnology partners with the Philosophy Department to examine the ethics of tomorrow.

Joan, an Army specialist, is rumbling down a remote gravel road, the hulking mountains of central Afghanistan barely visible through the dust kicked up by her convoy.

The next moment, the Humvee is nearly torn in half by an improvised explosive device.

It’s not until she wakes up later in a military hospital that reality hits her: She has lost her arm above the elbow, and her life is changed forever.

After months of rehab in the U.S., Joan participates in a clinical trial for a cutting-edge robotic arm controlled by a brain-computer interface (BCI). A chip implanted in her brain and a network of electrodes comb through the electric chatter, using artificial intelligence (AI) to decode Joan’s intentions — sending wireless signals to her robotic arm that enable her to perform basic functions.

Joan gets so used to the prosthetic that it’s no longer just a tool — she feels it has become an actual part of her.

Life begins to settle down again.

Joan is driving when her left arm jerks unexpectedly, causing the car to swerve off the road and plow through her neighbor’s fence.

Was it a malfunction? A disagreement between her “true” intentions and the device’s AI? Joan feels confused, guilty and alienated from her body — and her identity.

Who is at fault? Joan, or the software and hardware in her body?

Should her neural data be analyzed by lawyers to determine blame? Does she have any right to personal privacy?

Where does humanity end and technology begin?

Raising the questions

Joan’s dilemma, described by philosophers at the Center for Neurotechnology at the UW, is hypothetical, but it could one day be reality. Neurotechnology — the intersection of neuroscience, technology and computing — has brought within reach treatments and technologies for some of the human body’s most vexing problems, including spinal cord injury, stroke and Parkinson’s disease. But with those treatments come many areas of potential ethical conflict, and the center aims to raise important questions and awareness before design decisions become entrenched.

Based at the UW with core partners at MIT and San Diego State University, the Center for Neurotechnology was established in 2011 as an Engineering Research Center funded by the National Science Foundation. And from the very beginning, ethics has been one of its cornerstones.

The center’s neuroethics research is led by Sara Goering, a UW philosophy professor with a background in disability studies and bioethics, and Eran Klein, a neurologist and UW affiliate professor. Center members from the Department of Philosophy work closely with neuroscientists, engineers, doctors and industry professionals to develop effective technology in the most ethical ways.

But Goering and Klein aren’t necessarily here to give answers. They help identify important questions — for researchers, industry and our society to grapple with before it’s too late.

Klein gives an example of a time when technology raced past our attention to the ethical consequences: “How were we thinking about privacy in the ’90s, as the internet was being developed? We could’ve been thinking about the costs and benefits of giving up all our data. It would’ve been nice to have that conversation up front.”

Goering summarizes with a punch: “People may want ‘normal’ function. But not at any cost.”

The ethics team aims to ensure that disability perspectives are integrated into the design process at the earliest stages. They also work directly with research participants who are testing the technology of the future, learning more about what works, what doesn’t, and what concerns users may have.

Tim Brown, then a philosophy doctoral student of Goering’s, was at the forefront of one such project at the UW as an embedded ethicist in Electrical and Computer Engineering Professor Howard Chizeck’s BioRobotics Lab. It was a deep collaboration that saw Brown working alongside engineering researchers every day, both in the lab and in pilot studies.

This technology is personal

As electrical engineers monitor the dips and spikes dancing on a computer monitor, Brown asks study participant Fred Foy to touch his own nose, then Brown’s finger, then his nose again, then Brown’s finger again. The exercise is part of the Fahn-Tolosa-Marin rating scale, a way for researchers to gauge the severity of essential tremor — a condition that affects 7 million Americans.

Foy is in his 80s. He walks with a cane, likes to stay up reading magazines and is going on a date after his appointment. He also has an electrode implanted in his brain. Without it, his hands tremble constantly, making it difficult to do basic tasks like drinking water.

Normally, Foy’s implant uses open-loop deep-brain stimulation (DBS) to treat his condition. Tiny electrical pulses are delivered to Foy’s brain, reducing his tremors to a more manageable level. But “open loop” means it’s always on — so potential side effects, including trouble with speaking and balance, can also be constant.

Center researchers at the BioRobotics Lab are working on a solution. Their next evolution of deep-brain stimulation, known as closed-loop DBS, uses machine learning to sense incoming tremors and toggle on and off as needed — prolonging the life of the battery (which requires surgical replacement) and reducing side effects.
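As a rough illustration of what “toggling on and off as needed” can mean, here is a minimal Python sketch of one closed-loop control step. It is not the center’s algorithm: the simple band-power detector, the sampling rate and the thresholds below are hypothetical stand-ins for the machine-learning model in the real system.

```python
# Purely illustrative sketch of closed-loop DBS logic: stimulate only while
# a tremor is detected. A band-power threshold stands in for the real
# system's machine-learning detector; all names, rates and thresholds
# here are hypothetical.
import numpy as np

SAMPLE_RATE_HZ = 250          # hypothetical neural-sensing rate
TREMOR_BAND_HZ = (4.0, 8.0)   # essential tremor typically falls near 4-8 Hz
ON_THRESHOLD = 1.5            # hypothetical power level that turns stimulation on
OFF_THRESHOLD = 1.0           # lower "off" level adds hysteresis against rapid toggling


def tremor_band_power(window: np.ndarray) -> float:
    """Average spectral power in the tremor band for one window of sensed data."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE_HZ)
    in_band = (freqs >= TREMOR_BAND_HZ[0]) & (freqs <= TREMOR_BAND_HZ[1])
    return float(spectrum[in_band].mean())


def next_stim_state(window: np.ndarray, stim_on: bool) -> bool:
    """Closed loop: decide whether stimulation is on for the next interval.

    An open-loop device would simply return True on every call; sensing
    and toggling is what extends battery life and limits constant side
    effects.
    """
    power = tremor_band_power(window)
    return power > (OFF_THRESHOLD if stim_on else ON_THRESHOLD)
```

The gap between the two thresholds is one simple way to keep such a device from flickering on and off around a single cutoff; a learned model would make the same kind of decision from richer features of the neural signal.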

In partnership with medical device manufacturer Medtronic, Jeffrey Herron, ’16, a UW assistant professor of neurological surgery and center faculty member who earned his Ph.D. at the UW, led the creation of this first in-human closed-loop DBS system. And Foy is one of the first to test this technology.

Engineering researchers temporarily upload new algorithms to Foy’s device, turning it from open- to closed-loop. Then Brown helps run speech and mobility tests.

Foy draws a spiral, which is much smoother with closed-loop DBS than without. He capably lifts a bottle as if taking a drink of water — a task made challenging by his essential tremor, but made easier by closed-loop DBS. When he’s asked to name as many animals as he can think of in 30 seconds, the device automatically reduces stimulation, making it easier for Foy to speak.

Then Brown asks Foy deeper questions about his experience.

Have your moods, personality, thoughts or behaviors changed because of your device?

Have you ever felt that your actions were not your own because of the device?

Do you feel a stigma associated with this device?

The conversation soon turns to cyborgs, and Foy’s response is a frank reminder that this technology is personal:

“I don’t like the term ‘cyborg.’ I’m me, and this is a tool that helps me. That’s it.”

More than engineering

For four years, Brown had a desk in the BioRobotics Lab. Chizeck had specifically asked the center for an embedded ethicist.

“If you’re going to modify someone’s brain function and capabilities, there’s more to think about than just the hard engineering,” Chizeck says. “I wanted an ethicist in the lab, someone trained in the literature of ethics. I didn’t think there was any other way to do it.”

By being there — surrounded by electrical and computer engineering grad students, remote-control surgical robots, brain-computer interface devices, hacker magazines and stacks of empty coffee cups — Brown could talk with engineers as they went about their day-to-day work, helping them address concerns and familiarizing them with complex ethical concepts.

“If you don’t have a desk in the laboratory, you miss out entirely,” Brown says. “It allowed people to drop by and ask me random questions that popped in their heads. Big, tough questions.”

Brown earned his Ph.D. in December, but he continues to work with the UW neuroethics group as an NIH postdoctoral scholar. His current project, run by Goering and Klein, is to help create a conceptual map of how different types of brain-computer interfaces (BCIs) affect agency — a person’s sense of control over their own actions.

Not every BCI device impinges on a user’s sense of agency. But, for instance, if a person is fully paralyzed, and artificial intelligence helps “read” their brain activity and take action, they might feel they’ve traded in a significant amount of agency for technology that helps them accomplish daily tasks.

Fundamental questions

Goering, Klein and Brown are proud of how the Center for Neurotechnology is baking a philosophical perspective into the technology of the future. The center develops tools and ethical guidelines for this technology — and shares them with an international audience.

There’s much to feel good about. Brown says he’s seeing more partnerships between ethicists and technological innovators at the UW and other universities. Philanthropic support may unlock even more important research and collaboration. And BCIs have tremendous potential to improve people’s lives. He notes, though, that there is still a lot to explore:

“We have an opportunity to answer some really fundamental questions about the way humans operate — questions that have deep implications for how we think we should be. How do we engage with technology? What is the boundary between humans and technology? And what makes us, us?”

Originally published May 2020

What you care about can change the world

The University of Washington is undertaking its most ambitious campaign ever: Be Boundless — For Washington, For the World. When you support the Center for Neurotechnology Neuroethics Fund, you can help us develop effective technology in the most ethical ways.