September 19, 2025
Q&A: UW professor’s book explores how ‘technology is never culturally neutral’

In her new book, Katharina Reinecke explores how “digital culture shock” manifests in the world, in ways innocuous and sometimes harmful. (Princeton University Press)
“Culture shock” describes the overwhelm people can feel when suddenly immersed in a new culture. The flurry of unfamiliar values, aesthetics and language can disorient, discomfit and alienate. In her new book, “Digital Culture Shock,” Katharina Reinecke argues that technology can similarly affect people. Reinecke, a University of Washington professor in the Paul G. Allen School of Computer Science & Engineering, uses the phrase to “describe the experience and influence of actively or passively using technology that is not in line with one’s cultural practices or norms.”
The book explores how self-driving cars trained on U.S. streets would likely struggle in Cairo, with its drastically different road norms. It looks at how Yahoo! Japan, with its complex search interface, can overwhelm Americans used to Google’s minimalist design. And Reinecke digs into how so much technology emanating from specific regions, such as the Bay Area, can lead to forms of cultural imperialism.
UW News spoke with Reinecke about the book and how digital culture shock manifests in the world, in ways innocuous and sometimes harmful.
What was the spark that led to this book?
Katharina Reinecke: Maybe it was less of a spark and more of an embarrassment, but around 20 years ago I worked in Rwanda on developing an e-learning application for agricultural advisors in the country. When I presented the software I’d developed to some of the advisors, they very politely told me that they didn’t like the way it looked and didn’t find it intuitive to use. I realized that my cultural background had influenced all the little design decisions made while developing it: whether the interface should be colorful or simply gray and white, which I thought most people would prefer; whether users should be guided through the application or mostly explore on their own. The answer to any of these questions depends on a user’s upbringing, education, norms and values.
Once I realized that technology is never culturally neutral, I set out to earn a doctorate on this topic and the rest is history. Over the years, I kept collecting similar technology blunders. It turns out, like me, most people have no idea that their culture affects how they use technology and how they develop it. It’s just not something we usually think about or get taught.
Is there an example of digital culture shock that stands out to you the most or is particularly illustrative? Why?
KR: AI is all over the news these days, so let me start there. When ChatGPT and other generative AI tools came out, I think they really illustrated how their developers had made several design decisions that make these tools work well for some people, but not all. They are trained mostly on English data sources from the web, so early language models told us things like “I love my country. I am proud to be an American” or “I grew up in a Christian home and attended church every week.” Obviously, responses like these make many people aware that the AI is different from them.
We found that the way these language models speak and the values they convey align with only a tiny portion of the world’s population, while others can experience these interactions as a form of digital culture shock. And this is true for any AI application out there, from text-to-image models that generate pictures of churches when asked for houses of worship (as if churches are the only reasonable response) to self-driving cars trained in the U.S., which would likely not succeed in places where tuk-tuks and donkey carts share the road.
You discuss how much of the study of technology is conducted by and with people who are WEIRD, or Western, Educated, Industrialized, Rich and Democratic. What are the risks of the homogeneous digital culture that can emerge from this?
KR: The biggest risk is that technology will continue to be designed in ways that work for people most similar to those in the largest technology hubs, but that it is less usable, intuitive, trustworthy and welcoming to the rest of us. This risk has ethical consequences because technology should be equally usable and useful for all, especially given companies’ enormous profits. There are also several examples in my book that clearly show technology products can struggle to gain market share in cultures they were not designed for, so ignoring this is risky for companies as well.
As I discuss in the book, digital technology has been called out as a form of cultural imperialism because it embeds values and norms that are frequently misaligned with those of its users. This would be less of a problem if technology were designed in various technology hubs around the world, representing a diversity of cultures and values. But it is not. Most of the technology people use, no matter where in the world they are, was designed in the U.S., or it was influenced by user interface norms and frameworks developed in the U.S. So we’ve gotten ourselves into a situation where technology is slowly homogenizing and where people can best use it if they think and feel like its developers.
You finish the book with 10 misassumptions about technology and culture. What’s the single greatest, or most consequential, misassumption?
KR: To me, it is that people tend to think that one size fits all. They design technology and expect it to work for everyone, which is obviously not true.
For example, the Western obsession with productivity and efficiency often comes at the expense of interpersonal interactions. So many technology products are hyperfocused on making our days more efficient. There’s an app for every one of our “problems,” and all of them try to somehow get us to function better, faster and more productively. But this laser focus on streamlining misses the point that in many cultures, productivity works differently. In many East Asian cultures, for example, it takes time to build relationships before people will trust another person’s information, or information given by AI. So we need to get rid of the misassumption that technology design can be universal. My job would certainly be so much easier if people would stop believing this!
For more information, contact Reinecke at reinecke@cs.washington.edu.