Djordje Vlaisavljevic, Designer
In the first part of this series, we tried to define what it means to ‘design ethically.’ We concluded that things start to get blurry pretty fast and there is no single definite answer. So what if we try a different approach – Is it perhaps easier to define what unethical design means?
For a lot of people, privacy is what first comes to mind. Here’s a good quote from designer and activist Aral Balkan:
“When a company like Facebook improves the experience of its products, it’s like the massages we give to Kobe beef: they’re not for the benefit of the cow but to make the cow a better product. In this analogy, you are the cow.” Aral Balkan
Or maybe you’re more familiar with this simplified version:
“If you’re not paying for the product, you are the product.”
You are the product, and your value is in your personal data. When you bring up this argument, there will always be someone who says something along the lines of “if you’re not doing anything illegal and you don’t have anything to hide, why worry?” But things aren’t that simple. Let’s take a look at what happens to the data that gets collected.
Companies use this data to sell you more
They analyze your interests, habits, and vulnerabilities and then use that to persuade you with ads in order to make you a better consumer. To be fair: companies greatly exaggerate how smart and powerful their targeting is so they can sell more ads, but it’s still a problem.
Data gets sold to third parties
You may have given access to one company, but your data can end up in an entirely different place, which is not something you signed up for. Your health records, social security numbers, banking details, location history; all of this gets bundled into neat little packages and sold.
These packages present a simplified model of a person and contain their private, intimate details. There are systems that make automated decisions based on that data which can greatly limit someone’s choices, opportunities and life-chances. The way these systems work is often non-transparent, arbitrary, biased, unfair, and unaccountable.
What does this mean in practice?
For example, there have been cases where employers have been tracking the location data and search history of employees so they could stop them from forming unions and fighting for better working conditions.
Or, your government could use your location history and social media content to mark you as “undesirable” or a “terrorist”, like during the protests around the globe in 2020.
This is why “I have nothing to hide” is a naive point of view — it takes for granted that there is a benevolent, objectively good force running everything from corporations to governments behind the scenes. Nothing is black or white.
Your data is not secure
Even if we imagine that governments and companies will never abuse collected data, security is still an issue. There are regular people behind those systems who have access to your data and can leak it or dox you. There are hackers who can steal your data for profit or just for fun. As long as it’s there, somebody can abuse it.
Accessibility is another way design becomes unethical. When you’re designing, you’re always making compromises:
- Who benefits from your design?
- Who gets left out, intentionally or unintentionally?
When designers say “accessibility”, they usually mean designing with high contrast and making sure the design can be used via keyboard. But there’s so much more to it than that.
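The contrast requirement, at least, isn’t hand-wavy: WCAG defines a measurable contrast ratio between foreground and background colors, computed from relative luminance. Here’s a minimal TypeScript sketch of that formula (the function names are mine, not part of any framework):

```typescript
// Relative luminance per WCAG 2.1: linearize each sRGB channel, then
// weight the channels by how sensitive the human eye is to them.
function luminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: 1:1 (identical) up to 21:1 (black on white).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG level AA requires at least 4.5:1 for normal body text.
contrastRatio([0, 0, 0], [255, 255, 255]); // black on white → 21
contrastRatio([119, 119, 119], [255, 255, 255]); // #777 on white → ~4.48, fails AA
```

Note how unforgiving the threshold is: #777777 gray on white, which many designers would call perfectly readable, falls just short of the 4.5:1 AA requirement.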
Clubhouse has been the hot new app for quite some time now. It’s described as “drop-in audio chat”, and some of the most important tech discussions and networking happen there. But here are a couple of different ways it’s inaccessible:
- It’s hard for visually impaired people to use because of low contrast and no keyboard support
- It’s impossible for hearing-impaired people to use because its audio has no live captioning
- It’s invite-only, which excludes people who aren’t connected to the elite tech-bro early-adopters
- It’s iOS-only, which excludes Android users
The team behind Clubhouse is clearly communicating that they only want people who can see perfectly, hear perfectly, own iPhones, and who are “culture-fit”. Everybody else gets left out.
You could argue about whether that’s intentional or unintentional, but some real-life examples are definitely intentional.
Hostile architecture is one of them. The idea is to design public objects so that they still do their job but are impossible for homeless people to use to survive. A popular example is benches that are just about okay to sit on, but impossible to sleep on. Or ventilation shafts that homeless people cannot use to stay warm. The designers are pretty clear about who doesn’t get to benefit from their work.
Compare something like an axe or a TV to digital products and one big difference stands out. An axe or a TV just sits there and waits until you interact with it. Today’s tools start the interaction themselves. They’re always on, bombarding you with notifications, emails, and texts, begging for your attention. They don’t care what you want in that moment; they want your focus right now, and ideally all the time. And this is causing massive anxiety.
This is what the industry calls a dark pattern: a trick designers use to make you do things you don’t want to do. This is just one example, but there are dozens:
- Roach motel: Designers make subscribing ridiculously easy, but unsubscribing next to impossible.
- Confirmshaming: Guilting users into choosing something that benefits your company by using labels that shame them into compliance.
- Forced continuity: When a free trial ends, the user silently starts getting charged.
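Forced continuity in particular has a straightforward ethical counterpart: tell users before the trial converts into a paid subscription instead of charging them silently. A minimal sketch of that check (the function and its parameters are illustrative, not from any real billing API):

```typescript
// Decide whether a trial-expiry reminder is due: notify a fixed number of
// days before the trial converts to a paid plan, instead of charging silently.
function reminderDue(trialEnd: Date, now: Date, daysBefore = 3): boolean {
  const msBefore = daysBefore * 24 * 60 * 60 * 1000;
  // Due once we're inside the reminder window, but only while the trial is active.
  return now.getTime() >= trialEnd.getTime() - msBefore && now < trialEnd;
}
```

The design choice worth noting is the default itself: the honest version makes the reminder opt-out, not opt-in, so the user has to actively decide to keep paying.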
Not anticipating negative behavior
While dark patterns are mostly there intentionally and used to reach business goals no matter what, sometimes it seems people working on the product didn’t realize that their product can be used for harm. This usually happens when there’s a lack of diversity on teams, and in tech, which is still mainly white, male, straight, and neurotypical, that’s often the case. Sometimes they imagine the world as a perfect, happy place, where nothing bad ever happens, a view which clashes with reality.
Here are some potential problems to keep in mind:
- Mental health: How is your FOMO design affecting people who are struggling with depression, or ADHD, or anxiety?
- Trauma: Is your “On this day” feature showing your users a celebratory reminder that their friend committed suicide?
- Addiction: Is your addictive, habit-loop app psychologically and financially ruining users who struggle with addiction?
- Abusive relationships: Are your “Find my friends” and location tags helping abusive partners track and control their victims?
So what are the consequences of using unethical design? Why do we keep seeing this sort of behavior again and again?
Big tech companies that embrace unethical design have lower usability, frustrated users, and a loss of trust…
But they are so big and popular that people have no option but to ignore the issues and keep coming back. Yeah, you could put photos of your dog and breakfast on your personal website, but who’s going to see them there?
Sometimes, companies have to pay…
But it’s usually a symbolic amount compared to how much they earn from their unethical practices.
Or they just get scolded
They apologize, promise not to do it again, and then turn around and keep doing the exact same thing. How many times have you watched a congressional hearing where Mark Zuckerberg had to stand in front of old people who don’t understand anything about technology and pretend he’s sorry for something that’s literally the core of his business model?
The truth is, our current system doesn’t really punish this kind of behavior. We’re still celebrating companies and people who do this. It’s up to you to decide if you want to accept it and be a part of it. You can’t really expect to change the underlying system on your own, but you have to start somewhere. And at least you’ll sleep better knowing you’re not actively ruining people’s lives.
In the last part of this series on Design Ethics, we’ll talk about the part you can play in being more ‘ethical.’