Why user-centred design struggles with ethics

For decades we’ve been taught that user-centricity is the key to good design. But now the philosophy is showing vulnerabilities. We need approaches that are less narrow, less transactional, and more able to cope with the diverse, systemic challenges the 21st century has in store. Designer and futurist Cennydd Bowles elaborates.


In those toddler days, when the connected age was still taking its first tremulous steps, digital design was slow to find its identity. Design practices tended to fall into one of two extremes: either the experimental but baffling sandbox of Flash design, or design as web page layout, subsumed in a developer’s daily work, a matter of table hacks and spacer GIFs.


It wasn’t until the early years of the new millennium that the budding UX movement, a chimera of human-computer interaction and library science, brought some rigour and process. The key to the movement’s success was a single, external reference point: the user. We were taught to stop focusing on business or technical whims, and to instead do what’s right for users. Do that, and everything else will fall into place.

It worked, mostly. User-centred design (UCD) helped convince tech companies that design can offer a reliable and genuine competitive advantage. The field matured and companies grew; conferences, books, and celebrities sprang up.


User-centricity became an orthodox view even outside the world of design. Of course, the problem with orthodoxy is that other options start to seem ridiculous.


Discussing the quirks and flaws of the status quo is seen as weird or borderline heretical. But as it belatedly dawns on our industry that technology has serious social and ethical dimensions, one thing is increasingly clear: user-centred design is inadequate for the needs of the 21st century.

Make me think?

Steve Krug’s Don’t Make Me Think — by some distance the best-selling UX book ever — is less a textbook for designers than a primer for marketers, execs, or anyone else who needs to understand why they should fund design. As the title suggests, Krug claims the user shouldn’t have to worry about how a device or product works. It’s the designer’s job to present a simple mental model and then design away — or otherwise hide — complexities that might threaten that model.


‘Don’t make me think’ is an understandable and useful concept that comes with a downside. The mindset tempts us to design products that operate by sleight of hand, that read users’ minds and pull a rabbit out of a hat. It promotes seamless experiences that whisk away all the techie stuff. As a result, we train people to believe they have no business tinkering under the hood of their technologies: “trust us, we’ve got this” is the message.


Little surprise that the general public finds the world of technology dizzying.


Most people have little idea how connected tech works, thanks in part to its seamless design. So we face a dangerous pairing, where we paint opaque technologies with seamless, magical interfaces, providing excellent cover for exploitative data harvesting and transfers.


Data misuse may have been the previous decade’s predominant digital trend. Users don’t understand and can’t correct a system that’s not working in their best interests; instead, they have to put blind faith in developers or hope the OS overlords (Apple, Google, Microsoft) or local regulators have built in enough controls to prevent abuse. So far, those hopes haven’t come true.


Sometimes, making people think is the only way to give them agency, to help them make informed decisions about important ethical questions.

The user isn’t all that matters

The most obvious flaw? What’s good for the user may be awful for others. Design a beautiful interface that helps someone buy the SUV of their dreams and you’ll probably achieve business and user success, but at the cost of environmental damage. Build an app to help landlords turn city apartments into short-term vacation rentals? Your two primary user groups — property owners and visiting tourists — will be thrilled, but you’re making neighbourhoods worse, eroding local communities and pushing up rents for genuine residents.


Digital technology’s unique trait is scale; successful apps and products can reach millions of consumers within months. We have to recognise, then, that our responsibilities aren’t just to businesses and users.
We also have a responsibility to society, communities, and cities, to social goods like democracy and freedom, to non-human life, and to the planet itself. Yet these hidden, indirect stakeholders aren’t represented in UCD — instead, we assume every transaction between user and technology is positive and ignore any external damage the transaction may cause.
