Design Systems for Ethical Design — Part 2

Craig Villamor
cvil.ly
8 min read · Nov 18, 2022

--

Part 2 of 4: Dark Patterns

In this four-part series, I challenge design systems professionals to take on one of the most pressing issues in technology today — creating more ethical design.

Just landed here? Start with Part 1: We’ve Lost the Plot

Part 2: Dark patterns
Part 3: Why you should care
Part 4: What you can do

Unethical design has a name: Dark Patterns

The funny thing about discussing ethics is that it’s much easier to talk about what’s unethical. We have a name for unethical design: it’s called a dark pattern. Here’s how I am defining a dark pattern:

UI designs that trick or manipulate users into doing something they wouldn’t normally do, or into acting against their own or society’s best interests.

Patterns are something that the design systems community really understands. We can name them, classify them, codify them, improve them, and, if necessary, deprecate them.

11% of shopping sites and 95% of free Android apps deploy some form of dark pattern

Dark patterns are pervasive: recent studies found that 11% of shopping websites (Mathur et al. 2019) and fully 95% of free Android apps (Di Geronimo et al. 2020) deploy some form of dark pattern. And the use of dark patterns is growing, according to an FTC report released this September.

“Our report shows how more and more companies are using digital dark patterns to trick people into buying products and giving away their personal information”

– Samuel Levine, Director of the FTC’s Bureau of Consumer Protection

This isn’t too surprising, is it? Dark patterns are common enough that you probably encountered several this week, perhaps even today. All you need to do is open your inbox to find them. Where is that “Unsubscribe” link?!?

Dirty Tricks & Manipulation

I think of dark patterns in two major buckets — dirty tricks and manipulation. While there is some overlap between the two, I find the distinction helpful.

Dirty Tricks

These are your run-of-the-mill dark patterns. They are designed to trick users or just create annoying circumstances that cause them to do what the business selfishly wants.

There are a lot of dark patterns out there, and a lot of dark pattern libraries full of examples. Below we see 5 categories of dark patterns from UXP: Nagging, Obstruction, Sneaking, Interface Interference, and Forced Action. But there are plenty more, like Bait & Switch, Privacy Zuckering, Friend SPAM, and Roach Motel. Looking at all of these dark patterns and their classifications can be kind of fun at first, and then it just gets depressing.

Image showing the dark pattern names explained in the text, but floating on a black background with icons like a fish, a can of SPAM, Zuckerberg’s face, and a roach.
A smattering of dark pattern types.

Just to give you a couple of examples, here’s one from a few years back. The team at Zynga thought they were really clever by following the letter of the law with an unsubscribe link in their marketing emails, but they used white text on a white background to make it invisible. Isn’t that cute??

Two images: on top is the Zynga email unsubscribe link in white text on a white background, invisible to sighted users, and below is that text highlighted so that you can actually see the unsubscribe link text.
Zynga thought they were pretty clever back in 2018 when they hid the unsubscribe link.

And here’s one that I keep running into. I am a Gmail user who prefers Safari to Chrome. Google would much prefer that I use Chrome, so they tell me this again and again through a nifty little dialog. This is advertising disguised as UI. There’s no way to dismiss it permanently; I can only dismiss it temporarily, even though I have no intention of switching browsers.

Dialog from Gmail says “Google recommends using Chrome. Easily search on Google with the fast, secure browser.” Two buttons appear below: “Don’t switch” in text and “Yes” in a prominent blue button. There is no option to dismiss permanently.
There’s no way to permanently dismiss this ad for Google Chrome.

On the plus side, Dirty Tricks are relatively easy to spot once you know what to look for. Setting aside growth hacking cultures and office politics, most of these dirty tricks are easy to identify and fix. In each of these examples, a design system component could help ensure a more ethical solution — that text is always legible (and accessible!) and that promotional dialogs always have a “don’t show this again” option.
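To make that concrete, here’s a minimal sketch of what such a guardrail might look like in a React-based design system. Everything here is hypothetical (the component name, props, and markup aren’t from any real system), but the key move is making the “don’t show this again” handler a required prop, so a product team can’t ship the dialog without it.

```tsx
import React from "react";

// Hypothetical design-system component for promotional dialogs.
// The permanent-dismissal handler is required, so the ethical
// constraint lives in the component's API rather than in each
// product team's judgment.
interface PromoDialogProps {
  title: string;
  body: string;
  onAccept: () => void;          // e.g. "Yes, switch"
  onDismiss: () => void;         // "Not now"
  onDismissForever: () => void;  // "Don't show this again" (required)
}

export function PromoDialog(props: PromoDialogProps) {
  const { title, body, onAccept, onDismiss, onDismissForever } = props;
  return (
    <div role="dialog" aria-labelledby="promo-title">
      <h2 id="promo-title">{title}</h2>
      <p>{body}</p>
      {/* All three actions are rendered by the component itself,
          so the escape hatch can't quietly disappear. */}
      <button onClick={onAccept}>Yes</button>
      <button onClick={onDismiss}>Not now</button>
      <button onClick={onDismissForever}>Don't show this again</button>
    </div>
  );
}
```

A text component could take the same approach with color, refusing (or at least warning about) contrast ratios that make copy effectively invisible.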

Manipulation

Manipulation is the other kind of dark pattern. It’s more subtle and harder to identify than Dirty Tricks. That’s by design, of course, and part of what makes them so effective. Manipulation harnesses human psychology and emotions to drive profits, often creating habitual behaviors in the process.

Art installation: Can’t Help Myself, 2016, by Sun Yuan & Peng Yu at Guggenheim Museum, NYC

Speaking of habitual behavior, above is an exhibit at the Guggenheim titled “Can’t Help Myself”. Here’s the official description of this piece (also acting as alt text for above GIF):

The robot has one specific duty, to contain a viscous, deep-red liquid within a predetermined area. When the sensors detect that the fluid has strayed too far, the arm frenetically shovels it back into place, leaving smudges on the ground and splashes on the surrounding walls.

In other words, the “Can’t Help Myself” robot has a task that’s never done. Sound familiar? It’s a lot like us humans, scrolling, swiping, and tapping through social media feeds for hours a day: we know it’s not good for us, but we can’t seem to help ourselves.

A user swipes through dance videos on TikTok in rapid succession.
We can’t help ourselves with TikTok

Unintended Consequences

Not all manipulative patterns start out intentionally.

Photo of Aza Raskin being interviewed. He’s wearing glasses, facial hair, a beanie, and a button-down denim shirt as he gestures toward an interviewer at the edge of the frame.
Designer Aza Raskin, Photo credit: BBC News

Here’s Aza Raskin (above). He invented infinite scroll way back in 2006. Before infinite scroll, users would have to load content in batches, say 20 items at a time, then click “Next” when they reached the bottom of the page. It was cumbersome and often slow. Imagine consuming TikTok feeds this way!

Are all “frictionless” experiences a good thing? Maybe not.

Infinite scroll meant that the site or application would do all that heavy lifting for you. It would fetch more content before you reached the end of the page, effectively eliminating paging and creating a “frictionless” consumption experience. Frictionless experiences are something of a holy grail in tech, so this was great, right? Maybe not.
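Mechanically, there isn’t much to it. Here’s a rough sketch of how infinite scroll typically works in a browser today (the endpoint and element IDs are placeholders, and this obviously isn’t Raskin’s original code): a sentinel element near the bottom of the feed quietly fetches the next batch before you ever reach the end.

```ts
// Rough sketch of the infinite scroll mechanic. "/api/feed" and the
// element IDs are placeholders for illustration only.
const feed = document.querySelector("#feed")!;
const sentinel = document.querySelector("#feed-sentinel")!;
let nextPage = 1;

async function loadMore(): Promise<void> {
  const response = await fetch(`/api/feed?page=${nextPage++}`);
  const items: string[] = await response.json();
  for (const item of items) {
    const li = document.createElement("li");
    li.textContent = item;
    feed.appendChild(li);
  }
}

// When the sentinel scrolls into view, fetch more. Note what's missing:
// there is no last page, no "Next" button, no natural stopping point.
new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) {
    void loadMore();
  }
}).observe(sentinel);
```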

The assumption that making something easier to use is better for humanity was dismantled by [infinite scroll]… I should have spent more time thinking about the philosophy and the responsibility that comes along with the invention.

- Aza Raskin, GQ Magazine, June 2021

Aza didn’t realize the consequences of his invention at the time. This one invention became a core mechanic of addictive products like Facebook, Instagram, Twitter, Snap, and TikTok. It helped usher in…

The attention economy

The attention economy shifted revenue streams from users who pay for software to businesses who pay for user data.

More attention from users (i.e. more engagement) means more data, more targeted advertising, and massive revenue for anyone who can command more of your attention and sell it to the highest bidder. This is the driving force behind extractive user experiences.

A diagram showing an image of a man saying “look at me” with an arrow leading to a tube of 1s and 0s followed by an arrow leading to Scrooge McDuck and a pile of money.
Engagement drives the collection of user data that generates money through advertising.

Good for metrics, bad for humans

As we learned from infinite scroll, making something easier to do means people typically do a lot more of it. Removing friction can be a good thing when it benefits you, but a very bad thing when it benefits businesses at your expense.

Removing friction from an experience can create usage patterns that look a lot like addiction, and addiction is great for engagement.

Take sharing for example. It’s a core function of any social network. Sites like Twitter and Facebook make sharing really simple. The problem is what people often choose to share.

The Twitter logo contained in a chat bubble, which contains another Twitter logo and chat bubble, which contains a third Twitter logo and chat bubble.
Image credit: TechCrunch

Our survival instincts bias us toward negativity. Humans are naturally attracted to things that pose a threat, get us angry, or make us fearful. If a product team or organization is looking to maximize engagement, its actions and its algorithms are likely going to bias toward the negative because it works.

Amusing ourselves to death?

As Neil Postman describes in his still-relevant book about television, the job of the app, like the job of the television, is to entertain, not to inform. Unlike television, however, apps make it radically easy to share. Good, bad, factual, or farcical content can go “viral”, spreading quickly across the globe.

Cover of the book Amusing Ourselves to Death
Amusing Ourselves to Death by Neil Postman

“Going viral” just doesn't have the same ring to it

But virality remains the primary objective for social media companies and their algorithms, because more attention means more revenue. It’s as if we selectively forgot that viruses are not good for humans. This map from Johns Hopkins (below), along with our own recent experience, tells us how devastating “going viral” can be.

A map of global COVID spread from Johns Hopkins
Global COVID-19 infections, source: Johns Hopkins University & Medicine

Damn the Consequences

While manipulation is occasionally the result of unintended consequences, as in the case of infinite scroll, more frequently it’s a matter of “damn the consequences”.

Frances Haugen testifies before Congress

Frances Haugen is a former PM at Facebook (now Meta). She became a whistleblower and testified before Congress, where she had this to say about Facebook’s practices:

Facebook, over and over again, has shown it chooses profit over safety.

- Frances Haugen, former PM at Facebook

According to some of the research she leaked, 13.5% of UK teen girls said their suicidal thoughts became more frequent after starting on Instagram, and 17% of teen girls said their eating disorders got worse after using Instagram.

And a former Facebook executive had this to say about how he thinks of social media in his own life:

I can control my decision, which is that I don’t use that sh%t. I can control my kids’ decisions, which is that they’re not allowed to use that sh%t… The short-term, dopamine-driven feedback loops that we have created are destroying how society works.

- Chamath Palihapitiya, former VP of User Growth at Facebook

In Part 3: Why you should care, I’ll offer up 6 reasons why the design systems community should care about the problem of unethical design.

Appendix

You can find all of my resources and references for this series here.
