User experience designers have long known about dark patterns – those sneaky interface tricks that pressure or mislead users into choices they wouldn’t otherwise make. Think of a website shouting "Only 2 left, act now!" when stock is not actually scarce, or a pre-checked box quietly signing you up for a subscription you never intended. These manipulative UX practices have been debated for years, but now a major shift is underway. The European Union’s Digital Services Act (DSA) explicitly outlaws many such dark patterns, marking what could be the beginning of the end for deceptive design. This isn’t just a European concern either. It signals a broader change in digital ethics and product design across the globe.
The term "dark patterns" was coined in 2010 by UX designer Harry Brignull. It describes user interface tricks that deliberately mislead or coerce people into actions they might not take freely — like buying add-ons, sharing personal data, or signing up for subscriptions they didn’t intend. These design choices are not accidents. They are intentional manipulations of user behavior.
These practices are everywhere. Studies have shown that most major shopping sites use at least one dark pattern. The consequences are real. People get charged for things they didn’t want, end up sharing more data than they meant to, or simply feel manipulated. And when users feel tricked, they lose trust.
The Digital Services Act, which entered into force in late 2022 and became fully applicable in February 2024, is the EU's most ambitious digital regulation yet. Among its many provisions, it takes direct aim at dark patterns. It requires platforms to design their interfaces in a way that respects users’ freedom to make informed decisions.
The key provision here is Article 25, which says that providers of online platforms must not design, organise, or operate their interfaces in a way that deceives or manipulates users, or that otherwise materially distorts or impairs their ability to make free and informed decisions. In other words, if your interface is built to trick people, whether by nudging them toward a particular choice, hiding opt-outs, or creating false urgency, you're in violation.
The Act doesn’t just outline general principles. It gives concrete examples. Pre-ticked boxes? Not allowed. Fake timers that restart when refreshed? Also out. Cookie banners that make the "accept all" button bright and bold while burying the "reject" option in a greyed-out link? That counts too.
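To make the contrast concrete, here is a minimal TypeScript/DOM sketch of a consent banner built the other way around: no pre-ticked boxes, and accept and reject actions given equal weight. The element IDs, copy, and class names are my own illustrative assumptions, not anything prescribed by the DSA, and this is a sketch of the design intent rather than a compliance tool.

```typescript
// A minimal sketch, not legal advice: it illustrates the defaults the DSA's
// examples point toward (no pre-ticked boxes, visually symmetric accept/reject).
// All IDs, labels, and class names below are hypothetical.

type ConsentChoice = "accepted" | "rejected";

function renderConsentBanner(
  container: HTMLElement,
  onChoice: (choice: ConsentChoice) => void
): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.setAttribute("aria-label", "Cookie preferences");

  // Optional analytics consent starts unchecked: the user opts in, not out.
  const optIn = document.createElement("input");
  optIn.type = "checkbox";
  optIn.checked = false; // never pre-ticked
  optIn.id = "analytics-opt-in";

  const label = document.createElement("label");
  label.htmlFor = optIn.id;
  label.textContent = "Allow optional analytics cookies";

  // "Accept" and "Reject" share one style and one level of prominence,
  // so neither choice is visually privileged over the other.
  const accept = document.createElement("button");
  accept.textContent = "Accept selected";
  const reject = document.createElement("button");
  reject.textContent = "Reject all";
  for (const btn of [accept, reject]) {
    btn.className = "consent-button"; // no bright "accept" next to a greyed-out link
  }

  accept.addEventListener("click", () => onChoice("accepted"));
  reject.addEventListener("click", () => onChoice("rejected"));

  banner.append(optIn, label, accept, reject);
  container.appendChild(banner);
}
```

The point of the sketch is how little it takes: the same banner, with defaults flipped and styling equalised, goes from manipulative to neutral.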
And this law has teeth. Companies found to be in violation can face fines of up to 6 percent of their global annual revenue. That’s not a slap on the wrist. That’s billions for the biggest platforms.
You might wonder why a European law matters to companies or designers working elsewhere. The answer is simple. Most digital products today serve global audiences. If your site or app has users in the EU, you need to comply with the DSA. But more than that, it sets a new standard for what ethical UX looks like.
Companies don’t want to maintain separate design systems for Europe and the rest of the world. It’s messy, inefficient, and bad for brand trust. Just as GDPR privacy practices became widespread beyond the EU, we can expect the DSA’s dark pattern ban to ripple outward too.
We’re already seeing regulators in other regions take notice. In the US, the Federal Trade Commission published a 2022 staff report, Bringing Dark Patterns to Light, and has pursued enforcement over deceptive interface designs. California goes further: under the California Privacy Rights Act, agreement obtained through dark patterns does not count as valid consent. It’s becoming clear that what was once called "clever UX" is being redefined as unethical, and in some cases, illegal.
This regulation changes the conversation in product and design teams. Ethical UX is no longer just a nice-to-have. It’s a legal and strategic necessity. Companies are starting to audit their user flows, train their teams, and rethink the principles behind their design systems.
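What such an audit can look like in practice is sketched below: a small, hypothetical TypeScript structure for recording checks per user flow, so dark-pattern findings can be triaged like any other bug. The flows, questions, and severity labels are assumptions for illustration, not an official checklist.

```typescript
// A hypothetical sketch of encoding an interface audit as data so findings
// can be tracked per user flow. Flows, questions, and notes are illustrative.

type Severity = "pass" | "needs-review" | "violation";

interface AuditItem {
  flow: string;     // e.g. "checkout", "newsletter signup"
  question: string; // the check being applied
  result: Severity;
  notes?: string;
}

const interfaceAudit: AuditItem[] = [
  {
    flow: "newsletter signup",
    question: "Are all optional choices unchecked by default?",
    result: "pass",
  },
  {
    flow: "checkout",
    question: "Is urgency messaging (timers, stock counts) backed by real data?",
    result: "needs-review",
    notes: "Countdown banner source unclear; confirm with the commerce team.",
  },
  {
    flow: "account settings",
    question: "Is cancelling as easy as subscribing?",
    result: "violation",
    notes: "Cancellation requires contacting support; subscribing is one click.",
  },
];

// Summarise open issues so they can be triaged like any other bug.
const openIssues = interfaceAudit.filter((item) => item.result !== "pass");
console.log(`${openIssues.length} flows need follow-up`);
```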
But this isn’t just about avoiding fines. It’s about building better products. When designers focus on clarity, fairness, and user autonomy, the result is a more trustworthy experience. That builds loyalty. And loyalty, in turn, drives growth.
Think about the long-term implications. Without the crutch of manipulative tactics, companies will have to earn engagement through real value. That might be more challenging, but it’s also more sustainable. Ethical design isn’t just about staying compliant. It’s about building digital experiences that people believe in.
The Digital Services Act marks a turning point. By explicitly banning dark patterns, it calls time on a chapter of design that prioritized short-term gains over user trust. What comes next is still unfolding, but one thing is clear: deceptive design is no longer just bad practice. It’s a risk — legally, reputationally, and strategically.
For UX and product professionals, this is a moment to rethink how we define good design. Because in the years ahead, the experiences that stand out won't be the ones that trick users into clicking, subscribing, or consenting. They’ll be the ones that make people feel respected, understood, and in control.
Ethical design isn’t the future. It’s the new baseline. And that’s a good thing for everyone.