Design Ethics · Dark Patterns · UX Best Practices · Design Policy · Thought Leadership

Why We Need to Stop Calling Them 'Dark Patterns' and Start Naming the Intent

The term "dark patterns" is vague and unenforceable. Learn why we need precise terminology like Forced Continuity, Price Obfuscation, and Coerced Disclosure to regulate deceptive design. Includes 7-category taxonomy, legal examples, and why naming the intent makes design teams accountable.

Simanta Parida
Product Designer at Siemens
22 min read


We've all been there:

You click "Start Free Trial," and somehow you're now subscribed to a $49/month plan with no easy way to cancel.

You try to close your account, but the button is hidden behind three nested menus, two confirmation screens, and an "Are you really sure?" guilt trip.

You add an item to your cart for $29.99, and at checkout, the total is $47.83 because of "shipping protection," "handling fees," and a mysteriously pre-checked "premium support" box.

We call these dark patterns. The term has become shorthand for any deceptive or manipulative design practice. It's used in academic papers, policy discussions, and design ethics courses.

But here's the problem: "dark pattern" is a vague, aesthetically loaded term that describes the outcome (the "dark" feeling), not the specific cognitive manipulation being deployed.

And if we want to regulate, teach, and eliminate deceptive design, we need to stop using cute, ambiguous umbrella terms and start naming the intent.


The Problem with "Dark Patterns" as Terminology

The term "dark patterns" was coined by Harry Brignull in 2010 to describe user interfaces designed to trick users into doing things they didn't intend to do.

It's catchy. It's memorable. It's been widely adopted.

But it's also:

1. Aesthetically Misleading

The word "dark" implies something sinister in appearance—shadowy, hidden, visually obscure. But many dark patterns are bright, colorful, and highly visible.

  • A giant green "Yes, sign me up!" button next to a tiny gray "No thanks" link isn't visually dark—it's intentionally imbalanced.
  • A fake countdown timer screaming "ONLY 2 SPOTS LEFT!" in red and yellow isn't hidden—it's loud and designed to create urgency.

The manipulation isn't in the darkness. It's in the misdirection of attention, action, or understanding.

2. Legally Unenforceable

Regulators like the FTC, GDPR enforcers, and consumer protection agencies can't regulate "dark patterns" as a category. The term is too broad.

They can, however, regulate:

  • Deceptive pricing practices (hiding fees)
  • False scarcity claims (fake countdown timers)
  • Obstruction of cancellation (making it unreasonably difficult to exit)

Legal frameworks require specific, actionable language. "Dark pattern" doesn't provide that.

3. Pedagogically Weak

When teaching designers about ethical UX, "dark patterns" creates a binary: light vs. dark, good vs. bad.

But real-world design decisions are rarely that simple. Sometimes a design pattern is:

  • Persuasive (ethical nudge)
  • Coercive (unethical manipulation)
  • Contextually dependent (ethical in one scenario, unethical in another)

Using vague terminology allows designers to rationalize: "Well, it's not that dark."

But if you name the technique precisely—"Are we using Price Obfuscation to hide the true cost?"—the conversation shifts from subjective feelings to objective actions.


The Shift: Naming the Intent, Not the Feeling

Instead of calling everything a "dark pattern," we should name what the design is actually doing to the user's cognition, decision-making, or agency.

Here are some of the most common "dark patterns" renamed to reflect their true intent:

Example 1: The "Roach Motel" → Forced Continuity

What it is: Making it easy to sign up for a service but deliberately difficult to cancel.

Common implementations:

  • Sign-up is a single button click, but cancellation requires calling customer service during business hours
  • Cancel button is hidden in account settings behind multiple nested menus
  • Users are forced to go through exit surveys, retention offers, and guilt-inducing copy before they can cancel

Why "Forced Continuity" is better: It names the intent: forcing users to continue a service they no longer want by introducing friction at the exit point.

It's specific. It's measurable. It's enforceable.

Real-world example: The FTC sued Adobe in 2024 for using forced continuity by hiding early termination fees during sign-up and making cancellation intentionally difficult. The legal complaint didn't use the term "dark pattern"—it named the specific practice.


Example 2: The "Bait and Switch" → Misdirection of Action

What it is: Making users think they're taking one action, but the interface performs a different action.

Common implementations:

  • "Download" button that actually starts an installer for bundled software
  • "Continue" button that opts you into marketing emails (when you expected it to just move to the next step)
  • "X" to close an ad that actually opens the ad in a new tab

Why "Misdirection of Action" is better: It describes the cognitive trick: the user intends to do X, but the interface makes them do Y.

It focuses on the manipulation of intent, not the visual deception.

Real-world example: Many browser extension install flows use this pattern. You click "Install," expecting to add the extension, but you're also unknowingly agreeing to change your default search engine.


Example 3: Hidden Costs → Price Obfuscation

What it is: Hiding, delaying, or obscuring the true cost of a product until late in the purchase flow.

Common implementations:

  • Showing "$9.99" on the product page, but adding $15 in fees at checkout
  • Pre-checking "insurance" or "priority support" boxes that inflate the total
  • Displaying prices excluding VAT/taxes until the final confirmation screen

Why "Price Obfuscation" is better: It's a precise term for a specific deceptive practice: deliberately making the true cost unclear.

It's also already used in legal and regulatory contexts (e.g., EU price transparency laws).

Real-world example: Ticketing sites like Ticketmaster have been criticized for price obfuscation—a $50 ticket becomes $73 after "service fees," "processing fees," and "delivery fees" are added at checkout.
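The gap between the advertised price and the all-in price can be made concrete with a few lines of code. This is a minimal sketch, not any real ticketing site's logic; the fee names and amounts are hypothetical, loosely modeled on the $50-to-$73 example above.

```python
# Sketch of how "drip pricing" inflates an advertised price.
# Fee names and amounts are hypothetical.

ADVERTISED_PRICE = 50.00  # what the product page shows

# Fees revealed only at (or near) the final checkout step
LATE_FEES = {
    "service fee": 14.00,
    "processing fee": 5.50,
    "delivery fee": 3.50,
}

def all_in_price(advertised: float, fees: dict[str, float]) -> float:
    """The total the user actually pays once every fee is applied."""
    return round(advertised + sum(fees.values()), 2)

total = all_in_price(ADVERTISED_PRICE, LATE_FEES)
markup = (total - ADVERTISED_PRICE) / ADVERTISED_PRICE

print(f"Advertised: ${ADVERTISED_PRICE:.2f}")
print(f"At checkout: ${total:.2f} ({markup:.0%} higher)")
```

The transparent alternative is trivial: compute `all_in_price` up front and display that on the product page. The obfuscation is purely a matter of when the user is shown the number.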


Example 4: Fake Scarcity → Deceptive Use of Urgency

What it is: Creating artificial time pressure or scarcity to rush users into a decision.

Common implementations:

  • "Only 2 rooms left!" when there are actually 47 available
  • Countdown timers that reset when you refresh the page
  • "3 other people are looking at this right now" notifications that are randomly generated

Why "Deceptive Use of Urgency" is better: It names the psychological manipulation: false urgency to bypass rational decision-making.

Real-world example: Booking.com drew enforcement action from the Dutch consumer authority (ACM) and the European Commission over pressure-selling messages like "Only 1 room left!" when more rooms were in fact available; by 2020 it had committed to making its scarcity and discount claims transparent across the EU.


Example 5: Privacy Zuckering → Coerced Disclosure

What it is: Tricking users into sharing more personal information than they intended.

Common implementations:

  • Defaulting all privacy settings to "public" and burying the controls
  • Making the "Accept All Cookies" button large and colorful, while "Manage Preferences" is a tiny gray link
  • Requiring users to create an account and provide personal info just to view content

Why "Coerced Disclosure" is better: It describes what's happening: users are being pressured or tricked into disclosing data they wouldn't freely share.

Real-world example: Many GDPR cookie banners use this pattern. The "Accept All" button is prominent, while rejecting non-essential cookies requires 5+ clicks through nested menus. This isn't informed consent—it's coerced disclosure.
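The asymmetry in that cookie banner can be expressed as an interaction cost: how many steps does each choice take? The step lists below are hypothetical but mirror the "one click to accept, many clicks to reject" flow described above.

```python
# Sketch of "interaction cost asymmetry" in a consent flow.
# Step lists are hypothetical, mirroring a typical cookie banner.

ACCEPT_ALL_STEPS = ["click 'Accept All'"]

REJECT_STEPS = [
    "click 'Manage Preferences'",
    "open 'Analytics' section",
    "toggle off analytics",
    "open 'Marketing' section",
    "toggle off marketing",
    "click 'Confirm Choices'",
]

def asymmetry(accept: list[str], reject: list[str]) -> float:
    """How many times harder rejecting is than accepting."""
    return len(reject) / len(accept)

print(asymmetry(ACCEPT_ALL_STEPS, REJECT_STEPS))  # 6.0
```

A symmetric design would score 1.0: declining is exactly as easy as accepting. Anything much above that is friction pointed in one direction, which is the coercion.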


Example 6: Confirmshaming → Guilt-Based Rejection

What it is: Using guilt or shame to manipulate users into accepting an offer.

Common implementations:

  • Opt-out buttons that say "No thanks, I don't want to save money" instead of just "No"
  • Exit pop-ups that say "I guess I don't care about my health" when you decline a fitness app
  • Unsubscribe links with copy like "I want to miss out on exclusive deals"

Why "Guilt-Based Rejection" is better: It names the emotional manipulation: making users feel bad for declining.

Real-world example: Email pop-ups that say "No thanks, I prefer paying full price" instead of "Close" or "Not interested."


Example 7: Sneak Into Basket → Undisclosed Addition

What it is: Adding items to the user's cart without explicit consent.

Common implementations:

  • Pre-checking "Add gift wrapping for $4.99" during checkout
  • Automatically adding "expedited shipping" as the default option
  • Bundling "warranty protection" into the cart without clear opt-in

Why "Undisclosed Addition" is better: It's precise: something was added without clear user intent.

Real-world example: Airlines that automatically add travel insurance or seat selection fees to your booking unless you actively uncheck them.
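The difference between Undisclosed Addition and an honest upsell comes down to the default state of the selection. Here is a sketch pricing the same cart two ways; the item and fee names are hypothetical, loosely based on the airline example above.

```python
# Sketch: the same cart priced two ways. Item and add-on names are
# hypothetical. The deceptive variant pre-selects add-ons (opt-out);
# the transparent variant includes an add-on only on explicit opt-in.

CART = [("flight", 199.00)]
ADD_ONS = {"travel insurance": 24.00, "seat selection": 12.00}

def total(cart, add_ons, selected):
    """Sum the cart plus only those add-ons the user has selected."""
    base = sum(price for _, price in cart)
    extras = sum(add_ons[name] for name in selected)
    return round(base + extras, 2)

# Undisclosed Addition: everything is selected unless the user unchecks it
prechecked = set(ADD_ONS)
# Transparent default: nothing is selected until the user opts in
opted_in = set()

print(total(CART, ADD_ONS, prechecked))  # 235.0
print(total(CART, ADD_ONS, opted_in))    # 199.0
```

Same catalogue, same checkout; the only difference is whether `selected` starts full or empty. That default is where the deception lives, which is why "Undisclosed Addition" names the practice more precisely than "sneak into basket."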


Why Precision Matters: The Regulatory Angle

Language shapes law.

If we want to regulate deceptive design, we need specific, enforceable definitions, not vague umbrella terms.

Here's why precision matters:

1. Legal Enforceability

The FTC can't build a case on "using dark patterns" in the abstract. But it can sue for:

  • Deceptive advertising (false scarcity)
  • Unfair trade practices (price obfuscation)
  • Violation of consent requirements (coerced disclosure)

Precise terminology makes it easier for regulators to:

  • Identify violations
  • Write enforceable rules
  • Hold companies accountable

Example: The California Privacy Rights Act (CPRA) does define "dark patterns," but what regulators actually enforce are the specific practices its regulations spell out, such as:

  • Steering consumers toward privacy-invasive options by presenting one choice as more desirable than another
  • Making the privacy-protective option more time-consuming or difficult than its counterpart

That's precise. That's enforceable.

2. Design Team Accountability

When a PM asks, "Can we add this feature?" the conversation changes depending on the language used.

Vague version:

PM: "Can we make the cancel button less prominent?"
Designer: "Hmm, maybe. It's not that dark."

Precise version:

PM: "Can we implement Forced Continuity by hiding the cancel flow?"
Designer: "No. That's coercive and unethical."

Naming the intent makes it much harder to rationalize.

3. Educational Clarity

When teaching UX ethics, students need to understand:

  • What the manipulation is
  • How it works psychologically
  • Why it's harmful

"Dark pattern" doesn't provide that. But "Coerced Disclosure" does—it immediately tells you:

  • What's happening (disclosure is being forced)
  • How it works (through pressure or manipulation)
  • Why it's harmful (users aren't making free choices)

Building an Ethical Design Vocabulary

If we're going to move beyond "dark patterns," we need a shared vocabulary.

Here's a proposed taxonomy of 7 categories of deceptive design, named by intent:

  • Forced Continuity: make it difficult to exit or cancel (e.g., hidden cancel button, phone-only cancellation)
  • Misdirection of Action: make users think they're doing X when they're actually doing Y (e.g., "Download" button that installs bloatware)
  • Price Obfuscation: hide or obscure the true cost (e.g., hidden fees at checkout, taxes excluded from the displayed price)
  • Deceptive Use of Urgency: create false scarcity or time pressure (e.g., fake countdown timers, "Only 2 left!" when 50 are available)
  • Coerced Disclosure: trick or pressure users into sharing more data (e.g., cookie walls, "Accept All" as the only visible option)
  • Guilt-Based Rejection: use shame to manipulate acceptance (e.g., "No thanks, I don't want to save money")
  • Undisclosed Addition: add items without clear consent (e.g., pre-checked upsells, auto-added fees)

This isn't exhaustive, but it's a start.

The key is: every term describes a specific manipulation of user agency, cognition, or understanding.
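One practical use of a shared vocabulary is to make it machine-checkable in design reviews. The sketch below encodes the seven categories as data so a review checklist can ask about each one by name instead of debating whether something "feels dark." The category names come from the taxonomy above; the review-question wording is my own.

```python
# Sketch: the taxonomy encoded as data for a design-review checklist.
# Category names come from the article's taxonomy; the question
# phrasing is illustrative, not a standard.

DECEPTIVE_DESIGN_TAXONOMY = {
    "Forced Continuity": "Does exiting or cancelling take more steps than signing up?",
    "Misdirection of Action": "Could a user reasonably think this control does something else?",
    "Price Obfuscation": "Is the full cost visible before the final checkout step?",
    "Deceptive Use of Urgency": "Are scarcity or countdown claims backed by real inventory or deadlines?",
    "Coerced Disclosure": "Is declining data sharing as easy as accepting it?",
    "Guilt-Based Rejection": "Is the decline option worded neutrally?",
    "Undisclosed Addition": "Is every paid add-on unselected by default?",
}

def review_checklist() -> list[str]:
    """One named, answerable question per category of deceptive design."""
    return [f"[{name}] {question}"
            for name, question in DECEPTIVE_DESIGN_TAXONOMY.items()]

for item in review_checklist():
    print(item)
```

A team that runs every release through questions like these has, in effect, operationalized "naming the intent": each answer is a yes/no about a specific practice, not a vibe check.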


The Designer's Responsibility

As designers, we have a choice:

We can hide behind vague language like "growth hacking" or "conversion optimization" and pretend we're not manipulating users.

Or we can use precise language that names what we're actually doing.

Ask yourself:

  • Am I using Price Obfuscation to make the product seem cheaper than it is?
  • Am I using Misdirection of Action to get users to accept terms they wouldn't normally agree to?
  • Am I using Deceptive Use of Urgency to create false pressure?

When you name the intent, it becomes much harder to justify.

And that's the point.


The Connection to Broader Ethical Design

This isn't just about dark patterns—it's about building a culture of intentional, transparent, user-respecting design.

Precise language is part of that.

If you're interested in ethical design frameworks, check out my post on The Ethics of AI in Design, where I discuss the Responsible AI Design Framework (Transparency, Control, Fairness) and how to apply it to AI-driven personalization.

The principles are the same:

  • Name what you're doing
  • Be transparent about intent
  • Give users real control

Conclusion: Let's Retire the Cute, Vague Names

"Dark patterns" was a useful rallying cry in 2010. It brought attention to deceptive design.

But now we need to evolve the conversation.

Because when you call something a "dark pattern," you're describing how it feels. And feelings are subjective. You can argue about whether something "feels" dark.

But when you call it Forced Continuity, Price Obfuscation, or Coerced Disclosure, you're describing what it does. And actions are objective.

The question shifts from:

  • "Is this too dark?" (subjective, easy to rationalize)

To:

  • "Are we employing Price Obfuscation to meet our revenue target?" (objective, much harder to justify)

And that's a question every product team should be forced to answer honestly.




Have thoughts on ethical design terminology? What deceptive patterns have you encountered—and what would you call them? Let's build this vocabulary together.


About the Author

Simanta Parida is a Product Designer at Siemens, Bengaluru, specializing in enterprise UX and B2B product design. With a background as an entrepreneur, he brings a unique perspective to designing intuitive tools for complex workflows.

Connect on LinkedIn →
