UW News

May 21, 2018

Designed for evil: How to make bad technologies better


Nancy Tran shows off her group’s redesign of FakeApp, an app that uses machine learning to replace one person’s face with another’s in a video. The group developed a method that would deter amateurs from using FakeApp to create realistic-looking fake videos. Their design won the competition. Sarah McQuate/University of Washington

Many of today’s technologies leave users feeling like they got lost in a time vortex — they resurface hours later with no memory of what just happened to them.

But it doesn’t have to be that way, according to Alexis Hiniker, an assistant professor at the University of Washington’s Information School.

That’s why she developed a new upper-level class that gives informatics majors a crash course on ethics. Then they can use these ideas to combat potentially problematic new technologies.

Through Designing for Evil, which is unique to the UW, Hiniker’s students have identified “emerging evils” and redesigned these technologies so that they are more likely to enhance — not detract from — users’ lives. They presented their findings May 23 in a mini-symposium. See below for photos from the event.

Alexis Hiniker

“Even well-intentioned designers will make mistakes, and when you create technologies that millions or even billions of people use, every design decision has far-reaching impact,” Hiniker said. “We want to better prepare our students to design products with a more robust definition of what it means to do the right thing. That goes beyond thinking: ‘I would like using this.’”

Hiniker divided her course into two sections. In the first half, the students surveyed a variety of moral philosophies, like consequentialism and deontology, to learn about ethical reasoning and how to translate it into building ethical technologies.

During the second half, the students applied their knowledge to a variety of case studies, including artificial intelligence, surveillance technology and technology that’s designed to be addictive. For each topic, they discussed potential pitfalls and then practiced making better technologies.

“In our studio exercises, students worked through design challenges, like taking a particular ethical idea and using it to redesign Reddit,” Hiniker said.

In addition to the case studies, the students have been working in groups on their final projects. Each group selected a technology, ranging from the seemingly benign WebMD website to the potentially sketchy social credit system that China is rolling out.

“The first goal of the project is to scare people: Hey! You haven’t thought about how bad this is and it’s already happening,” Hiniker said. “And then the second goal is to show people that we can take that scary thing and redesign it in a way that’s grounded in both philosophy and user input. Then you can do a lot better.”

Hiniker hopes that the class will help her students think more about the bigger picture as they enter the tech industry after graduation.

“We need to design technology so that people can access its value without opening a Pandora’s box that they can’t control,” she said. “And if you want to shift the design approach, you need to understand what it is you want to stand up for and why. Otherwise it’s really easy for the status quo to win.”

Stephanie Lim explains how her team made WeChat, a popular social networking app, better.

Jimmy Nguyen and his team talk about how Chinese people view their new social credit system as a good way to hold people accountable for their actions. The team had some ideas to increase transparency and give users some control over their data.

Zhanna Voloshina explains how her group would make the WebMD website more inclusive and help people more easily select symptoms.

Sabrina Niklaus discusses how The League, a dating app, could be considered exclusive and elitist. Her group redesigned the app so it could keep some of its unique features without being seen as discriminatory.

Eric Jacobson (center) and Yuliya Labaz explain how their "productive mode" would give Netflix users a choice to limit binge watching.

John Akers' group worked with FakeApp, an app that uses machine learning to replace one person's face with another's in a video. The group developed a method that would deter amateurs from using FakeApp to create realistic-looking fake videos. The redesigned FakeApp won the competition.

Quan Nguyen (back, center) and Kathryn Brusewitz explain their new guidelines for voice assistants like Google Duplex or Apple's Siri.

Royce Le (right) discusses how his group would help Tinder users focus less on physical traits and get to know each other the way they would if they met at a coffee shop.

Estelle Jiang explains how her team made WeChat, a popular social networking app, better.

Adam Bourn (left) discusses his group's approach for redesigning Hello Barbie. Their model would respect children's privacy while addressing parents' concerns.

Kyle Williams-Smith (left) and Hassan Farooq discuss their redesign of Tinder. Their new app would help users focus less on physical traits and instead get to know each other the way they would if they met at a coffee shop.

Estelle Jiang demonstrates one of the current drawbacks of WeChat: It shares your location with users all over the world, even when you've stopped using that part of the app.

The FakeApp redesign would add a moving watermark across its videos to deter amateurs from trying to create and disseminate fake videos. This design won the competition.

This slideshow shows Hiniker’s students presenting their work at the May 23 mini-symposium. Photos by Sarah McQuate/University of Washington.
