You’re browsing for flights. The cheapest one shows “$89” in bold letters. You click. Suddenly it’s $127. The base fare was $89, but then came the seat selection you didn’t ask for, the insurance auto-checked at checkout, and a “processing fee” that appeared out of nowhere. You try to uncheck them. The boxes are tiny. The “Continue without insurance” button is grey and barely visible. The bright blue “Keep protection” button begs for attention.
That’s not an accident. Every element was designed to make you spend more than you intended. The tiny checkboxes, the opt-outs buried in small print, the visual hierarchy that highlights the choices that benefit the company – these are dark patterns. And they’ve been everywhere, in everything, for years. Until now.
In 2026, regulators around the world have finally decided that enough is enough. What designers used to consider “clever UX” or “growth hacking” is now explicitly illegal in major markets. The rules have changed. The enforcement is real. And the penalties are big enough to make even the largest tech companies rethink how they design interfaces.
Dark patterns are design choices that trick users into doing things they didn’t intend to do. They exploit users’ psychology, attention, and trust for the company’s benefit, at the users’ expense.
The pre-ticked checkbox that signs you up for newsletters. The bright “Accept All Cookies” button next to a grey “Manage Preferences” link in small text. The cancellation process that requires a phone call during business hours, when signing up took thirty seconds online. The countdown timer that says “Only 2 seats left at this price!” and resets every time you refresh.
These aren’t bugs. They’re features, deliberately built by product teams optimizing for metrics such as conversion rate and average order value without asking how those numbers were achieved. For years, this worked. Companies made millions by making it difficult to say no, easy to agree by mistake, and difficult to get out.
A 2022 study from the European Commission found that 97% of the websites and apps most popular with EU consumers deployed at least one deceptive pattern. Not a few bad actors. Ninety-seven percent. Dark patterns had become the norm, not the exception.
Why Regulators Finally Acted
Governments were content to ignore dark patterns for years because the harm was difficult to quantify.

The Regulations Changing UX in 2026
Multiple laws now target dark patterns from different angles, creating a comprehensive regulatory web that’s hard to escape.
The Digital Services Act prohibits manipulative interface design across all digital services in the EU. This isn’t limited to data collection. Any interface element designed to deceive users or impair decision-making violates the DSA.
The Digital Markets Act specifically targets large platforms, prohibiting them from using dark patterns to maintain dominance. Self-preferencing through interface design, making it harder to choose competitors’ services, exploiting default settings—all banned for “gatekeepers.”
GDPR enforcement has intensified around consent patterns. Regulators now interpret manipulative consent flows as violations of Article 7’s requirement that consent be “freely given.” Companies face massive fines when audits reveal dark patterns in cookie banners or privacy choices.
The upcoming Digital Fairness Act, expected in 2026, will consolidate and clarify anti-dark-pattern rules across the EU. It specifically targets addictive design, exploitative interfaces in gaming and social media, and algorithmic manipulation based on profiling vulnerable users.
In the US, states led by California expanded privacy laws to explicitly prohibit dark patterns that interfere with privacy rights. The FTC’s 2024 study revealed that 76% of examined websites and apps employed at least one possible deceptive pattern, providing ammunition for enforcement actions.
Specific patterns that were once common are now explicitly banned or severely restricted:
False urgency creates artificial scarcity. “Only 2 rooms left!” claims when inventory isn’t actually tracked, or countdown timers that reset, violate laws against misleading consumers.
Obstruction makes desired actions difficult and undesired ones easy. Requiring phone calls to cancel when signup was online, hiding opt-out options, forcing users through multiple confirmation screens to decline – all prohibited.
Forced action bundles consent for unrelated things. Requiring users to accept marketing emails to create accounts, making newsletter signup a prerequisite for downloading content, tying service access to unnecessary data sharing – illegal under GDPR and similar laws.
Interface interference uses visual design to steer choices. Making “Accept” buttons big and colourful while “Reject” is small and grey, using confusing language to obscure meaning, placing checkboxes where users click them accidentally – all considered deceptive design.
Sneaking hides information or costs until late in the process. Revealing fees only at final checkout, auto-adding items to carts, pre-selecting paid options – none of it is permissible under consumer protection laws that require transparent pricing.
Some companies worried that these regulations would kill conversion rates. Without dark patterns, would anyone buy anything? Would anyone consent to data collection?
The opposite happened. Companies forced to adopt ethical UX found that it often works better than manipulation. Transparency builds trust. Trust drives loyalty. Loyal customers are more valuable than deceived ones.
Companies like Hey.com built their entire brand on ethical design: no trackers, no manipulation, complete transparency. Mozilla implemented clear consent toggles where users actually control their choices. The New York Times redesigned its consent flows to be symmetrical and readable instead of manipulative.
These companies didn’t sacrifice business metrics. They improved them. Users who knowingly choose to engage convert better and stay longer than those who were tricked into signing up and later feel deceived.
The math works because ethical UX lowers support costs, avoids regulatory fines, builds brand reputation, and creates sustainable growth instead of churning through manipulated users who feel burned.
Meeting these regulations isn’t complicated. It means designing in users’ interests rather than purely for company metrics.
Early enforcement targeted the biggest and most obvious violators. TikTok paid a €345 million fine over accounts that defaulted to public. Major platforms came under fire for their consent mechanisms. High-profile cases sent clear messages.
But enforcement is moving to smaller companies. Regulators understand dark patterns aren’t just a tech-giant problem. E-commerce sites, subscription services, and apps of all sizes use manipulative design. As enforcement bodies gain experience and establish processes, they’re pursuing a broader range of violations.
The penalty math matters. GDPR fines run up to 4% of worldwide annual revenue. For companies doing billions in revenue, that’s catastrophic. Even more modest fines hurt enough to make compliance cheaper than continued manipulation.
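To make the penalty math concrete, here is a minimal sketch in Python. The revenue, uplift, and enforcement-probability figures are illustrative assumptions, not data about any real company, and the expected-value framing deliberately ignores reputational damage:

```python
# Hedged sketch: compares the maximum GDPR fine (4% of global annual
# revenue) against a hypothetical revenue uplift from a manipulative flow.
# All figures below are illustrative assumptions, not real data.

def max_gdpr_fine(global_annual_revenue: float) -> float:
    """Upper bound on a GDPR fine: 4% of global annual turnover."""
    return 0.04 * global_annual_revenue

def manipulation_is_worth_it(revenue: float, uplift_rate: float,
                             enforcement_probability: float) -> bool:
    """Naive expected-value check: does the uplift from a dark pattern
    exceed the expected fine? (Ignores reputational damage, which the
    article argues is often the larger cost.)"""
    expected_fine = enforcement_probability * max_gdpr_fine(revenue)
    uplift = uplift_rate * revenue
    return uplift > expected_fine

# A hypothetical company with €2B global revenue, a 0.5% revenue uplift
# from a manipulative checkout flow, and a 25% chance of enforcement:
revenue = 2_000_000_000
print(max_gdpr_fine(revenue))                          # 80000000.0 (€80M cap)
print(manipulation_is_worth_it(revenue, 0.005, 0.25))  # False: €10M < €20M
```

Even before counting trust and churn effects, the expected fine dwarfs the uplift for any plausible enforcement probability.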
More importantly, being publicly called out for dark patterns destroys trust that takes years to rebuild. Getting caught makes a front-page story. It tells customers that you knowingly designed against their interests. That’s a marketing disaster no amount of money can fix.
The Digital Fairness Act, coming in 2026, will fill the remaining gaps and harmonize enforcement across the EU. It specifically addresses addictive design in social media and gaming, algorithmic manipulation, and exploitative personalization.
Expect regulations to expand beyond overt deception into subtler manipulation. Attention engineering, algorithmic amplification optimised for maximum engagement regardless of user wellbeing, interfaces optimised for addiction instead of value – these are next.
The US is likely to continue with state-by-state expansion of privacy laws, including anti-dark-pattern provisions, with federal legislation possible but in doubt. Other regions are taking a close look at EU enforcement and contemplating similar frameworks.
Globally, the trend is clear: design decisions are legal decisions now. UX isn’t just about users. It’s about compliance, liability and whether your interface violates laws that are specifically designed to prevent manipulation.
The smartest companies aren’t battling these regulations. They’re adopting them as a chance to differentiate with trustworthy design.
Audit your current flows. Identify the manipulative elements before regulators do. Replace dark patterns with transparent choices. Test whether ethical design actually hurts metrics – usually it doesn’t.
Train teams to be aware of manipulation. Designers, product managers and growth teams need to know what’s now illegal and why. “It increased conversions 3%” is no defence when the technique violates user rights.
Build design systems for compliance. Create components, patterns, and guidelines that make ethical UX the default. Make it harder to accidentally build a non-compliant interface than a compliant one.
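As a sketch of what compliance-by-default could look like in a design system, here is a hypothetical Python lint for a consent-dialog component. The config schema and rule names are illustrative assumptions, not a real framework API:

```python
# Hypothetical design-system check: flags consent-dialog configs that
# exhibit banned patterns (pre-checked consent, asymmetric prominence,
# obstruction). The ConsentDialog schema is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class ConsentDialog:
    accept_label: str
    reject_label: str
    accept_style: str                      # e.g. "primary" or "secondary"
    reject_style: str
    prechecked_options: list = field(default_factory=list)
    steps_to_accept: int = 1
    steps_to_reject: int = 1

def lint_consent_dialog(dialog: ConsentDialog) -> list:
    """Return a list of compliance issues (empty list means no flags)."""
    issues = []
    if dialog.prechecked_options:
        issues.append("pre-checked options: consent must be opt-in")
    if dialog.accept_style != dialog.reject_style:
        issues.append("asymmetric prominence: accept/reject must match")
    if dialog.steps_to_reject > dialog.steps_to_accept:
        issues.append("obstruction: rejecting takes more steps than accepting")
    return issues

# A dialog with a bright primary Accept, grey Reject, and a pre-ticked box:
bad = ConsentDialog("Accept all", "Manage preferences",
                    accept_style="primary", reject_style="secondary",
                    prechecked_options=["marketing_emails"],
                    steps_to_accept=1, steps_to_reject=3)
print(lint_consent_dialog(bad))   # three issues flagged
```

Running a check like this in CI makes the ethical version the path of least resistance, which is the whole point of encoding compliance in the design system.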
Most importantly, change the mindset from “what can we get away with” to “what serves users best.” Regulations are forcing what should have been the standard practice all along – respecting people enough to let them make true choices.
Dark patterns aren’t clever growth hacks anymore. They’re manipulation tactics that are illegal and carry real penalties. The regulatory environment of 2026 makes that absolutely clear.
Ninety-seven per cent of the top sites had dark patterns. That number has to go to zero – not because companies suddenly found ethics, but because laws now require them to respect users.
This isn’t the death of persuasive design. It’s the end of deceptive design. You can still craft compelling experiences, guide users toward valuable choices, and optimise for business objectives. You just can’t deceive people anymore.
The companies winning in this environment are those that figured out ethical UX performs better anyway. Transparency beats manipulation. Trust beats trickery. Sustainable growth beats churning through users who feel deceived.
Design like your interfaces will be scrutinised by regulators, because they will. Build like users deserve respect because they do. And remember that the best UX was never about manipulating people – it was about helping them achieve what they actually wanted to do.
That hasn’t changed. We’re just finally making it mandatory.
Are dark patterns always illegal?
It depends on the pattern and the jurisdiction. Obviously deceptive practices – concealing fees, artificially obstructing cancellation, using false urgency – are banned in most major markets. Subtler persuasion techniques remain legal if they don’t materially impair user decision-making. The key test: is the design helping users make informed choices, or manipulating them into choices that mainly benefit the company? If your design passes that test, you’re usually fine. If it fails, you’re probably violating regulations in the EU, in California, and in a growing number of other jurisdictions.
How can I tell if my product uses dark patterns?
Start by auditing for the common ones: Are there more steps required to opt out than to opt in? Are any boxes pre-checked? Is important information concealed or obscured? Does visual design make company-preferred options radically more prominent? Is cancellation harder than signup? Are there artificial urgency or false scarcity claims? If you answer yes to any of these, you probably have compliance problems. Conduct user testing for clarity and genuine understanding, not just conversion optimisation. Document your design decisions and make sure they are user-centred rather than manipulation-focused.
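The checklist above can be sketched as a simple self-audit script. The question keys and wording below are illustrative assumptions that simply mirror the questions in the paragraph:

```python
# Minimal self-audit sketch mirroring the checklist above.
# Answer True where the problematic condition holds; any True answer
# indicates a likely compliance problem. Keys are illustrative.
AUDIT_QUESTIONS = {
    "more_steps_to_opt_out_than_in": "More steps to opt out than opt in?",
    "prechecked_boxes": "Any boxes pre-checked?",
    "important_info_obscured": "Important information concealed or obscured?",
    "asymmetric_prominence": "Company-preferred options far more prominent?",
    "cancellation_harder_than_signup": "Cancellation harder than signup?",
    "false_urgency": "Artificial urgency or false scarcity claims?",
}

def audit(answers: dict) -> list:
    """Return the questions flagged as problems (answered True)."""
    return [AUDIT_QUESTIONS[key] for key, bad in answers.items() if bad]

# Example self-assessment of a hypothetical signup flow:
flags = audit({
    "more_steps_to_opt_out_than_in": False,
    "prechecked_boxes": True,
    "important_info_obscured": False,
    "asymmetric_prominence": True,
    "cancellation_harder_than_signup": False,
    "false_urgency": False,
})
print(flags)  # two flagged questions
```

A yes-count above zero is the signal to involve design and legal before a regulator does.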
What penalties do companies face?
In the EU, GDPR violations can draw fines of up to €20 million or 4% of global annual turnover, whichever is greater. DSA fines can reach 6% of worldwide annual turnover. Recent enforcement includes fines in the hundreds of millions for serious violations. In the US, FTC enforcement can involve substantial monetary penalties, mandatory refunds to harmed consumers, and consent decrees imposing ongoing compliance monitoring. Beyond fines, companies face reputational damage, class action lawsuits, and a loss of user trust that often costs more than the penalties themselves.
Will removing dark patterns hurt our metrics?
Some metrics may dip initially, but most companies find ethical UX works better in the long run. Users who make authentic, informed choices are more likely to stay as customers, recommend the service, and have a higher lifetime value. Ethical design means fewer support tickets from confused or frustrated users, no regulatory fines or legal expenses, a stronger brand reputation, and sustainable growth. Companies such as Mozilla and Hey.com prove that transparent, user-respecting design works commercially. The short-term conversion bump manipulation might yield isn’t worth the long-term cost in trust, reputation, and legal exposure.
Will regulation keep expanding?
Yes. The EU’s Digital Fairness Act in 2026 will expand and clarify the current rules. Expect continued evolution around addictive design, algorithmic manipulation, and exploitative personalisation in social media, gaming, and e-commerce. Other jurisdictions are watching EU enforcement and considering similar frameworks. The trend is toward stricter regulation of manipulative design, not looser standards. Companies should assume regulatory scrutiny will go up, not down. Building ethical UX now prepares you for the regulations that haven’t been written yet, while serving users better today.